Adaptive illumination for a time-of-flight camera on a vehicle
11523067 · 2022-12-06
CPC Classification
B60R11/04: PERFORMING OPERATIONS; TRANSPORTING
H04N25/533: ELECTRICITY
H04N23/74: ELECTRICITY
G05D1/0088: PHYSICS
International Classification
G05D1/00: PHYSICS
Abstract
Disclosed are devices, systems, and methods for capturing an image. In one aspect, an electronic camera apparatus includes an image sensor with a plurality of pixel regions. The apparatus further includes an exposure controller. The exposure controller determines, for each of the plurality of pixel regions, a corresponding exposure duration and a corresponding exposure start time. Each pixel region begins to integrate incident light at the corresponding exposure start time and continues to integrate light for the corresponding exposure duration. In some example embodiments, at least two of the corresponding exposure durations or at least two of the corresponding exposure start times are different in the image.
Claims
1. A method of generating an image using an electronic camera, comprising: determining, for each object in one or more objects imaged by an image sensor of the camera, a distance between the object and the image sensor using a distance sensor; associating, for each object in the one or more objects, a pixel region of the image sensor with the object; determining, for each pixel region associated with an object in the one or more objects, a start time for exposure of the pixel region to light incoming to the pixel region using the distance determined for the object; determining, for each pixel region associated with an object in the one or more objects, an exposure duration; and gating each pixel region associated with an object in the one or more objects using a gating circuit associated with the pixel region, wherein the gating circuit is configured to begin exposure of the pixel region at the exposure start time determined for the pixel region and continue the exposure for the exposure duration determined for the pixel region, wherein at least two of the determined exposure durations are different from each other or at least two of the determined exposure start times are different from each other.
2. The method of claim 1, comprising associating a first pixel region with a first object in the one or more objects and associating the first pixel region with a second object in the one or more objects, determining a first exposure start time using a first distance determined for the first object and determining a second exposure start time using a second distance determined for the second object, wherein the first distance is different from the second distance.
3. The method of claim 2, comprising gating the first pixel region using the gating circuit associated with the first pixel region such that the gating circuit begins exposure of the first pixel region at the first exposure start time, continues the exposure for a first exposure duration, begins exposure of the first pixel region again at the second exposure start time, and continues the exposure for a second exposure duration during obtaining a single image using the image sensor.
4. The method of claim 1, comprising activating a light source to illuminate a scene viewed by the camera.
5. The method of claim 4, wherein the light source is a flash type light source.
6. The method of claim 4, wherein the exposure start time determined for a pixel region associated with an object in the one or more objects corresponds to a time of flight of light from the light source to the object and back from the object to the image sensor of the camera.
7. The method of claim 1, wherein, for a current image, the exposure duration determined for a pixel region among the pixel regions associated with the one or more objects is determined using one or more images obtained using the image sensor, and wherein the one or more images immediately precede the current image.
8. The method of claim 1, wherein the exposure duration for a pixel region among the pixel regions associated with the one or more objects is determined using a light intensity at the pixel region.
9. A non-transitory computer readable storage medium encoded with executable instructions for operating an electronic camera that, when executed by at least one processor, cause the at least one processor to perform operations, comprising: determine, for each object in one or more objects imaged by an image sensor of the camera, a distance between the object and the image sensor using a distance sensor; associate, for each object in the one or more objects, a pixel region of the image sensor with the object; determine, for each pixel region associated with an object in the one or more objects, a start time for exposure of the pixel region to light incoming to the pixel region using the distance determined for the object; determine, for each pixel region associated with an object in the one or more objects, an exposure duration; and gate each pixel region associated with an object in the one or more objects using a gating circuit associated with the pixel region, wherein the gating circuit is configured to begin exposure of the pixel region at the exposure start time determined for the pixel region and continue the exposure for the exposure duration determined for the pixel region, wherein at least two of the determined exposure durations are different from each other or at least two of the determined exposure start times are different from each other.
10. The non-transitory computer readable storage medium of claim 9, wherein the executable instructions cause the at least one processor to perform operations, comprising: associate a first pixel region with a first object in the one or more objects and associate the first pixel region with a second object in the one or more objects, determine a first exposure start time using a first distance determined for the first object and determine a second exposure start time using a second distance determined for the second object, wherein the first distance is different from the second distance.
11. The non-transitory computer readable storage medium of claim 10, wherein the executable instructions cause the at least one processor to perform operations, comprising: gate the first pixel region using the gating circuit associated with the first pixel region such that the gating circuit begins exposure of the first pixel region at the first exposure start time, continues the exposure for a first exposure duration, begins exposure of the first pixel region again at the second exposure start time, and continues the exposure for a second exposure duration during obtaining a single image using the image sensor.
12. The non-transitory computer readable storage medium of claim 9, wherein the executable instructions cause the at least one processor to perform operations, comprising: activate a light source to illuminate a scene viewed by the camera.
13. The non-transitory computer readable storage medium of claim 12, wherein the light source is a flash type light source.
14. The non-transitory computer readable storage medium of claim 12, wherein the exposure start time determined for a pixel region associated with an object in the one or more objects corresponds to a time of flight of light from the light source to the object and back from the object to the image sensor of the camera.
15. An electronic camera apparatus, comprising: an image sensor with a plurality of pixel regions; an exposure controller configured to control, for each pixel region in the plurality of pixel regions, an exposure duration and an exposure start time corresponding to the pixel region; and a plurality of gating circuits, wherein each gating circuit is configured to gate a pixel region in the plurality of pixel regions to begin exposure of the pixel region at the exposure start time corresponding to the pixel region and continue the exposure for the exposure duration corresponding to the pixel region, wherein each pixel region is configured to begin collecting incoming light starting at the corresponding exposure start time and continue collecting the incoming light during the corresponding exposure duration, wherein the exposure start time for a pixel region in the plurality of pixel regions is determined using a distance between the image sensor and an object imaged by the pixel region, and wherein, for the plurality of pixel regions, at least two of the determined exposure durations are different from each other or at least two of the determined exposure start times are different from each other.
16. The apparatus of claim 15, wherein, for a first pixel region in the plurality of pixel regions, the exposure controller is configured to determine a first exposure start time using a first distance between the image sensor and a first object imaged by the first pixel region and determine a second exposure start time using a second distance between the image sensor and a second object imaged by the first pixel region, wherein the first distance is different from the second distance.
17. The apparatus of claim 16, wherein a gating circuit in the plurality of gating circuits is configured to begin exposure of the first pixel region at the first exposure start time, continue the exposure for a first exposure duration, begin exposure of the first pixel region again at the second exposure start time, and continue the exposure for a second exposure duration during obtaining a single image using the image sensor.
18. The apparatus of claim 15, wherein, for an object imaged in a single image by several pixel regions in the plurality of pixel regions, all pixel regions in the several pixel regions have a same corresponding exposure start time.
19. The apparatus of claim 15, wherein the exposure duration corresponding to a pixel region in the plurality of pixel regions is determined using a light intensity at the pixel region.
20. The apparatus of claim 15, wherein the distance is determined using at least one of: a LiDAR sensor, a RADAR sensor, or an ultrasonic sensor.
Description
DETAILED DESCRIPTION
(5) Visual perception in self-driving or autonomous vehicles requires cameras that perform reliably in challenging environments, such as backlit or poorly lit scenes. Because of imbalanced light distribution or an insufficient amount of light, a single global shutter speed for the entire image sensor imaging a scene may not be feasible. Consistent with the disclosed subject matter is a time-of-flight (ToF) camera, such as a Complementary Metal Oxide Semiconductor (CMOS) or Charge-Coupled Device (CCD) camera, in which the exposure to light of individual pixels, or regions of pixels, can be controlled to achieve different exposures for different pixels or regions of the image sensor. Different areas in the resulting image may be generated with different exposure start times and/or exposure durations. The different exposure start times may be determined in part from the different distances between the image sensor and the objects in the image, obtained from a pre-built high-definition (HD) map or from distance sensors such as LiDAR, RADAR, or ultrasonic distance sensors. In some example embodiments, a light source such as a flash is initiated. From the known location of the flash and the distances between the flash and the various objects, the arrival times at the image sensor of the light reflected from the objects can be determined. Other sensors may measure the light intensity received from the different objects; from these intensities, the amount of time the electronic shutter should remain open for each pixel region can be determined.
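By way of illustration only, the two per-region quantities described above can be sketched in a few lines of Python. Every name below is hypothetical, and the inverse-intensity scaling of the duration is an assumed stand-in for whatever mapping an actual implementation would use:

```python
from dataclasses import dataclass

C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

@dataclass
class RegionExposure:
    """Hypothetical per-pixel-region shutter settings."""
    start_s: float     # delay after the flash before the gate opens
    duration_s: float  # how long the gate stays open

def exposure_for_region(distance_m: float, intensity: float,
                        base_duration_s: float = 1e-6) -> RegionExposure:
    # The gate opens when the reflected flash light arrives back at the
    # sensor, i.e. after a round trip at the speed of light.
    start = 2.0 * distance_m / C_M_PER_S
    # Brighter regions get shorter durations (simple inverse scaling).
    duration = base_duration_s / max(intensity, 1e-6)
    return RegionExposure(start_s=start, duration_s=duration)
```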
(6) The techniques described in this patent document provide electronic shutter control for individual pixels or regions of pixels. The electronic shutter control incorporates information from a high-definition map and/or distance sensors to provide better camera perception in poor lighting conditions, thereby improving camera performance in autonomous driving applications.
(8) In the example of scene 110, objects such as tree 114 and box 112 are included. Tree 114 is closer to image sensor 120 than box 112 is. Other objects (not shown) may be in scene 110, such as roads, buildings, other vehicles, road signs, bicycles, pedestrians, and the like. Some objects may be at the same distance from image sensor 120 and other objects may be at different distances. Some objects may be more brightly illuminated than others due to their proximity to a light source, their proximity to the image sensor, and/or their reflectivity. Generally, closer and/or more reflective objects appear brighter at the image sensor, and farther and/or less reflective objects appear less bright.
(9) Image sensor 120 captures an image of scene 110, including box 112 and tree 114 as well as other objects that are at different distances from image sensor 120. Image sensor 120 may be a two-dimensional array of pixels, where each pixel generates an electrical signal related to the light intensity impinging on it. When combined, the electrical signals corresponding to the individual pixels produce an electrical representation of the image of scene 110. Image sensor 120 may include 1,000,000 pixels or any other number of pixels. Although not shown, optical components such as lenses, apertures, and/or mechanical shutters may be included in the camera that includes image sensor 120. Image sensor 120 may provide for applying an electronic shutter to individual pixels or regions of pixels. An electronic shutter is an enabling signal, or a gate, that enables a corresponding pixel or pixel region to capture incident light when the electronic shutter is open (the enabling signal or gate is active) and to not capture incident light when the electronic shutter is closed (the enabling signal or gate is inactive). Shown at 120 is a two-dimensional array of pixel regions including pixel regions 122, 124, 126, 128, and others. Image sensor 120 may be a charge-coupled device or any other type of image sensor that is electronically gated or shuttered.
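As a minimal sketch of what gating a pixel region means, assuming incident light is sampled at discrete time steps, a region can be modeled as accumulating charge only while its gate is open. This is illustrative only, not the disclosed circuit:

```python
def integrate_gated(incident: list[float], dt_s: float,
                    start_s: float, duration_s: float) -> float:
    """Accumulate incident light only while the electronic shutter
    (gate) is open, i.e. for t in [start_s, start_s + duration_s)."""
    charge = 0.0
    for step, flux in enumerate(incident):
        t = step * dt_s
        if start_s <= t < start_s + duration_s:
            charge += flux * dt_s  # gate open: integrate incident light
    return charge                  # gate closed elsewhere: no contribution
```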
(10) Pixel region controller 130 controls the exposure of the pixel regions (or individual pixels) of the image sensor 120.
(11) Exposure control module 140 determines the exposure start times for the various pixel regions of image sensor 120, that is, the time at which the exposure starts for each pixel region. Exposure control module 140 may receive distance information related to the objects in scene 110 from one or more sensors such as RADAR sensor 154, LiDAR sensor 152, ultrasonic sensor 156, and/or high-definition map 150, as further detailed below. For example, after a flash is initiated at camera system 100, exposure control module 140 can determine when the electronic shutter should open for each pixel region based on the known speed of light and the known distances between image sensor 120 and the objects imaged at each pixel region. As an illustrative example, tree 114 may be imaged by one or more pixel regions such as pixel region 122. After a light flash is initiated, the exposure start time for pixel region 122 may be set to the round-trip travel time of the light from the flash to tree 114 and back to image sensor 120.
(12) Exposure control module 140 may receive distance information related to scene 110 from one or more sensors such as RADAR sensor 154, LiDAR sensor 152, ultrasonic sensor 156, and/or high-definition map 150. As illustrative examples, exposure control module 140 may receive from LiDAR sensor 152 the distance from tree 114 to image sensor 120; it may receive from RADAR sensor 154 the distance from box 112 to image sensor 120; and it may receive from ultrasonic sensor 156 the distances of both box 112 and tree 114 from image sensor 120. Exposure control module 140 uses the distance information to determine the exposure start times for the pixel regions of image sensor 120.
(13) Exposure control module 140 determines an exposure duration for each of the various pixel regions, that is, the length of time that the electronic shutter remains open for each pixel region after opening at the corresponding exposure start time. For example, the electronic shutter corresponding to pixel region 126 may open when light has propagated from a flash to box 112, been reflected, and propagated back to pixel region 126. In this example, the image of box 112 at pixel region 126 may be brighter than the image of tree 114 at pixel region 122. Accordingly, the exposure duration for pixel region 126 may be shorter than the exposure duration for pixel region 122. The brightness may be determined from the signal levels of the pixels or pixel regions in a recent previous image. As an illustrative example, when images are taken at 30 frames per second, there are about 33 milliseconds between frames, and the scene viewed from an autonomous vehicle changes little in 33 milliseconds. In this case, the pixel values in the previous image, which correspond to the light intensities in that image, may be used to determine the exposure duration for the same pixel region in a later image. For example, in a first image, a pixel region may be saturated or otherwise negatively affected by too much light exposure, which may be determined by image processing of the values from that pixel region. In the next image of the same scene, the exposure duration may then be reduced, based on the values in the previous image, to prevent saturation or other negative effects. Objects that appear brighter at image sensor 120 can have a shorter exposure duration. The exposure duration for each pixel region may be adjusted on successive images, and the exposure duration of each pixel region may be adjusted independently of the other pixel regions. In this way, different pixel regions may have different exposure durations, producing an image with correct exposure across all pixel regions.
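A possible realization of this frame-to-frame feedback, with assumed thresholds and scale factors, might look like the following, where a region's pixel levels are normalized to the range 0 to 1:

```python
SATURATED = 0.95  # assumed fraction of full scale treated as clipped
TOO_DARK = 0.10   # assumed fraction of full scale treated as underexposed

def next_duration(prev_mean_level: float, prev_duration_s: float) -> float:
    """Adjust a region's exposure duration from its mean pixel level
    in the immediately preceding frame."""
    if prev_mean_level >= SATURATED:
        return prev_duration_s * 0.5  # halve exposure to escape clipping
    if prev_mean_level <= TOO_DARK:
        return prev_duration_s * 2.0  # lengthen an underexposed region
    # Otherwise nudge the region toward mid-scale.
    return prev_duration_s * (0.5 / prev_mean_level)
```

Because each region is adjusted from its own history, bright and dark areas of the same scene can settle at different durations independently.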
(14) Exposure control module 140 may receive distance information from high-definition map 150, which may include a map of known objects in the vicinity of a GPS location provided to the high-definition map 150. The high-definition map 150 may contain detailed information about roads, lanes on the roads, traffic lights, stop signs, buildings, and so on. Using GPS information locating the autonomous vehicle and camera system, the vehicle can be placed in the high-definition map 150. Accordingly, the objects stored in the map at the location of the vehicle can be located relative to the vehicle, and distances to the objects can be determined from the map and the vehicle's location. The vehicle's location on the map can be updated as the vehicle moves. In some embodiments, the vehicle's location may be advanced to a later time on the map using the location, velocity, and acceleration known at a starting time. In this way, the vehicle's location may be determined less frequently while the accuracy of the location between successive updates is improved.
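The between-update prediction described above is ordinary dead reckoning under a constant-acceleration assumption; a one-dimensional sketch:

```python
def predict_position(p0: float, v0: float, a0: float, dt_s: float) -> float:
    """Advance the last known map position by dt_s seconds using
    p = p0 + v0*dt + 0.5*a0*dt^2."""
    return p0 + v0 * dt_s + 0.5 * a0 * dt_s ** 2

# A vehicle at 20 m/s accelerating at 1 m/s^2 moves about 2.005 m
# in the 0.1 s between location updates.
assert abs(predict_position(0.0, 20.0, 1.0, 0.1) - 2.005) < 1e-9
```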
(17) At 310, the process includes receiving, from one or more distance sensors, distances between an image sensor and objects imaged by the image sensor. For example, a LiDAR distance sensor may determine a first distance between the image sensor and a first object. An ultrasonic distance sensor may determine a second distance. A RADAR distance sensor may determine a third distance and/or may provide a redundant distance measurement to an object whose distance is also determined by another distance sensor. In some example embodiments, the distance may be measured to a fixed point associated with the image sensor, in which case the measurement may be corrected by the fixed offset distance.
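One plausible way to implement step 310's combination of redundant readings, including the fixed-offset correction, is shown below. The simple averaging is an assumption; an implementation might instead weight each sensor by its accuracy:

```python
def fused_distance(readings_m: list[float], offset_m: float = 0.0) -> float:
    """Combine redundant distance readings for one object (e.g. from
    LiDAR, RADAR, and ultrasonic sensors) and correct for a fixed
    mounting offset between the distance sensor and the image sensor."""
    if not readings_m:
        raise ValueError("no distance readings for object")
    return sum(readings_m) / len(readings_m) + offset_m
```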
(18) At 320, the process includes associating one or more pixel regions of the image sensor with each of the objects. As an illustrative example, an image sensor may have 100,000 pixel regions, each region including 100 pixels, for a total of 10,000,000 pixels in the image sensor. In this example, a car is imaged on the image sensor. The image of the car occupies 1,000 pixel regions, which are associated with the car object. The car is at a distance from the image sensor that is determined by one or more distance sensors. Other objects may be imaged on the image sensor, each occupying a number of pixel regions that depends on the size of the object and its proximity to the image sensor. The pixel regions associated with these other objects correspond to objects at various distances from the image sensor, as determined by the one or more distance sensors.
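Step 320 might be realized as below, assuming each detected object is reported as a bounding box in pixel coordinates and the sensor is tiled into fixed-size rectangular regions; all names are hypothetical:

```python
def regions_for_bbox(x0: int, y0: int, x1: int, y1: int,
                     region_w: int, region_h: int) -> set[tuple[int, int]]:
    """Return the (column, row) indices of every pixel region touched
    by the half-open bounding box [x0, x1) x [y0, y1)."""
    cols = range(x0 // region_w, (x1 - 1) // region_w + 1)
    rows = range(y0 // region_h, (y1 - 1) // region_h + 1)
    return {(c, r) for c in cols for r in rows}

# A 400 x 250 pixel car image on a sensor tiled into 10 x 10 regions
# touches 40 x 25 = 1,000 regions, matching the example above.
assert len(regions_for_bbox(0, 0, 400, 250, 10, 10)) == 1000
```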
(19) At 330, the process includes determining an exposure start time for each of the pixel regions based on the distances to the objects and the associations of the one or more pixel regions with the objects. The exposure start time is the amount of time after an illumination event, such as a flash, at which the light reflected from an object arrives at the image sensor. Continuing the example above, one or more of the distance sensors may determine that the car is 100 feet from the camera. The round-trip time of flight for light generated by a flash at the camera is twice the one-way time of flight. In this example, if light takes about 1 ns to travel one foot, then light from the flash reflected by the car arrives back at the camera 200 ns after the flash. Accordingly, the exposure start time would be set to 200 ns after the flash for the 1,000 pixel regions associated with the car. Because a car has a complex shape and an orientation relative to the camera and image sensor, some pixel regions may have different exposure start times according to their different distances. Pixel regions associated with other objects may have different exposure start times due to those objects' different distances from the camera.
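Reproducing the paragraph's arithmetic as a check, using the text's rounded figure of roughly one nanosecond of light travel time per foot:

```python
NS_PER_FOOT = 1.0  # rounded for illustration; the exact figure is ~1.017 ns/ft

def exposure_start_ns(distance_ft: float) -> float:
    """Round-trip time of flight: flash out to the object and back."""
    return 2.0 * distance_ft * NS_PER_FOOT

# Car at 100 ft: the gate for its pixel regions opens 200 ns after the flash.
assert exposure_start_ns(100.0) == 200.0
```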
(20) At 340, the process includes determining a corresponding exposure duration for each pixel region. The exposure duration is the amount of time, after the exposure start time that begins the capture of an image, for which the image capture continues. For example, an exposure duration is the amount of time that a charge-coupled device camera sensor integrates charge due to incident light. Each pixel region is exposed to incoming light starting from its corresponding exposure start time and continuing for its corresponding exposure duration. The exposure duration may differ between pixel regions whose images have different light intensities. For example, pixel regions corresponding to bright areas of an image have shorter exposure durations than pixel regions corresponding to less bright areas. In this way, different pixel regions have different exposure durations according to the brightness of the associated image areas, thereby preventing underexposed and overexposed image areas. In some example embodiments, at least two of the corresponding exposure durations or at least two of the corresponding exposure start times are different in the image.
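Putting steps 330 and 340 together, a complete per-region gating schedule might be assembled as follows; the linear brightness-to-duration mapping is an assumed stand-in:

```python
def build_schedule(region_distance_ft: dict[tuple[int, int], float],
                   region_level: dict[tuple[int, int], float],
                   base_duration_ns: float = 1000.0
                   ) -> dict[tuple[int, int], tuple[float, float]]:
    """Map each pixel region to a (start_ns, duration_ns) gate. The
    start comes from round-trip time of flight (~1 ns/ft); the
    duration shrinks for regions that previously measured as bright."""
    schedule = {}
    for region, dist_ft in region_distance_ft.items():
        start_ns = 2.0 * dist_ft
        level = min(region_level.get(region, 0.5), 1.0)  # normalized 0..1
        duration_ns = base_duration_ns * (1.0 - 0.8 * level)
        schedule[region] = (start_ns, duration_ns)
    return schedule
```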
(22) Implementations of the subject matter and the functional operations described in this patent document can be implemented in various systems, semiconductor devices, camera devices, digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of aspects of the subject matter described in this specification can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a tangible and non-transitory computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing unit” or “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
(23) A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
(24) The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
(25) Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random-access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
(26) While this patent document contains many specifics, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this patent document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
(27) Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Moreover, the separation of various system components in the embodiments described in this patent document should not be understood as requiring such separation in all embodiments.
(28) Only a few implementations and examples are described and other implementations, enhancements and variations can be made based on what is described and illustrated in this patent document.