CAMERA SENSOR FOR LIDAR WITH DOPPLER-SENSING PIXELS
20220011431 · 2022-01-13
Inventors
CPC classification
G01S17/58
PHYSICS
G01S17/34
PHYSICS
International classification
Abstract
Doppler LIDARs, such as those used in ADAS (advanced driver assistance systems) and autonomous vehicles, may need to sense objects in many directions. Some Doppler LIDAR devices use mechanically moving parts to scan over a range of directions, so the various directions are not sensed simultaneously but in turns over time. Mechanically moving parts generally bring higher cost, lower reliability and shorter Mean Time To Failure (MTTF). The camera sensor for LIDAR with Doppler-sensing pixels disclosed herein uses a Doppler-sensing chip that enables Doppler LIDAR devices to sense many directions simultaneously without mechanical scanning and mechanically moving parts, or at least with reduced use thereof. Lower cost and higher reliability, as well as higher direction-sensing accuracy, are objectives of this invention.
Claims
1. A camera sensor, comprising: an array of pixels; and an interface module, coupled with the pixels, for conveying sensing results outside the sensor; wherein each of the pixels comprises: a photo-detector for detecting a modulated light signal from or associated with objects being sensed, and producing a detected signal; and at least one mixer, coupled with the photo-detector and the interface module, for mixing at least one local replica signals with the detected signal or a signal derived from the detected signal, and producing at least one mixing product signals.
2. The pixels of claim 1, each further includes a first filter, coupled with the photo-detector and the mixer, for attenuating interference and noise outside the frequency band of a modulating signal used to modulate the modulated light signal.
3. The pixels of claim 1, each further includes a second filter or filters, coupled with the mixer, for attenuating frequency components outside the band of interest in the mixing product signal.
4. The at least one mixer of claim 1 includes: a quadrature mixer; a plurality of mixers or quadrature mixers, each separately mixing the detected signal or a signal derived from the detected signal with one of: at least one CW components of the local replica signals; at least one FMCW components of the local replica signals; one CW component of the at least one CW components of the local replica signals; one FMCW component of the at least one FMCW components of the local replica signals; a square wave counterpart of one CW component of the at least one CW components of the local replica signals; and a square wave counterpart of one FMCW component of the at least one FMCW components of the local replica signals.
5. The pixels of claim 1, each further includes a micro optical lens on top of said each pixel, for directing light exposed onto the pixel area more onto the photo-detector area of said each pixel.
6. The camera sensor of claim 1, further includes at least one pre-processing functional module, coupled with the mixers of the pixels, for pre-processing the mixing product signals, and selecting, among all, some of the pixels with their addresses and mixing product signals to be exported for further processing.
7. The camera sensor of claim 6, wherein the pre-processing functional module functions as at least one of: an estimator that estimates a quantity associated with a strength of detected said modulated light signals; an estimator that estimates a quantity associated with a Doppler shift of detected said modulated light signals in its CW modulated envelope; an estimator that estimates a quantity associated with a frequency shift of detected said modulated light signals in its FMCW modulated envelope; and an estimator that estimates a quantity associated with a distance of an object associated with detected said modulated light signals.
8. The camera sensor of claim 1, further includes at least one of: at least one first amplifiers, coupled with the photo-detectors and the mixers, for amplifying the detected signals; at least one second amplifiers, coupled with the mixers, for amplifying the mixing product signals; at least one filters, coupled with the mixers, for attenuating frequency components outside the band of interest in the mixing product signals; at least one analog to digital converters, coupled with the mixers, for digitizing the mixing product signals; and at least one digital signal processor module, coupled with the pixels, for processing the mixing product signals.
9. The camera sensor of claim 1 is built on a semiconductor chip.
10. The camera sensor of claim 9 wherein the array of pixels are placed on the semiconductor chip in an area of at least one of: a rectangular shape; a round shape; a ring shape; an oval shape; an oval ring shape; and a curved belt shape.
11. The camera sensor of claim 9 wherein the array of pixels are placed on the semiconductor chip and spaced according to at least one of: Cartesian coordinates; and polar coordinates.
12. The camera sensor of claim 9 wherein the array of pixels are placed on the semiconductor chip in a plurality of zones, and wherein, in each of the zones the pixels are placed evenly according to one of a Cartesian or a polar coordinates, and densities of placement are based on the zone the pixels belong to.
13. A Doppler LIDAR receiver, comprising: a Doppler sensor having an array of Doppler sensing pixels, for producing at least one Doppler sensing signals; an optical scope, optically coupled with the Doppler sensor, for producing an optical image of objects on the Doppler sensor; a digital signal processor module, coupled with the Doppler sensor, for processing the at least one Doppler sensing signals; and whereby, the array of Doppler sensing pixels each is operable to mix a detected light signal from or associated with an object under detection and exposed onto said pixel with a local replica signal, and produce at least one of the Doppler sensing signals; and whereby, the digital signal processor module is operable, based on at least one of the Doppler sensing signals produced by a pixel and an address of said pixel in the array, to detect at least one of: a relative moving speed of said object and direction of moving in terms approaching to or leaving from said LIDAR receiver; a distance between said object and said LIDAR receiver; and a direction of said object relative to said LIDAR receiver.
14. An omnidirectional Doppler LIDAR receiver, comprising: a Doppler sensor having an array of Doppler sensing pixels, for producing at least one Doppler sensing signals; an optical scope, optically coupled with the Doppler sensor, for producing an optical image of objects on the Doppler sensor; a convex mirror, optically coupled with the optical scope, for redirecting lights from objects under detection in all directions of a hemisphere or partial hemisphere into the optical scope; a digital signal processor module, coupled with the Doppler sensor, for processing the at least one Doppler sensing signals; and whereby, the array of Doppler sensing pixels each is operable to mix a detected light signal from or associated with an object under detection and exposed onto said pixel with a local replica signal, and produce at least one of the Doppler sensing signals; and whereby, the digital signal processor module is operable, based on at least one of the Doppler sensing signals produced by a pixel and an address of said pixel in said array, to detect at least one of: a relative moving speed of said object and direction of moving in terms approaching to or leaving from said LIDAR receiver; a distance between said object and said LIDAR receiver; and a direction of said object relative to said LIDAR receiver.
15. The omnidirectional Doppler LIDAR receiver of claim 14, wherein the convex mirror is substantially a hyperbolic mirror.
16. The omnidirectional Doppler LIDAR receiver of claim 14, wherein the array of Doppler sensing pixels are placed in an area shaped as one of: a round shape; and a ring shape.
17. The omnidirectional Doppler LIDAR receiver of claim 14, further includes another duplicated set of the omnidirectional Doppler LIDAR receiver on the opposite side, so that the two hemispherical omnidirectional Doppler LIDAR receivers together form a spherical omnidirectional Doppler LIDAR receiver.
18. The omnidirectional Doppler LIDAR receiver of claim 14, wherein the convex mirror and the optical scope are replaced by a fish-eye lens scope.
19. The omnidirectional Doppler LIDAR receiver of claim 18, further includes another duplicated set of the omnidirectional Doppler LIDAR receiver on the opposite side, so that the two hemispherical omnidirectional Doppler LIDAR receivers together form a spherical Doppler LIDAR receiver.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] For a better understanding of the invention and to show more clearly how it may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings which illustrate distinctive features of at least one exemplary embodiment of the invention, in which:
DETAILED DESCRIPTION OF THE INVENTION
[0016] It will be appreciated that in the description herein, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the invention. Furthermore, this description is not to be considered as limiting the scope of the invention, but rather as merely providing a particular preferred working embodiment thereof.
[0019] The light sensing area containing the Doppler sensing pixel array in sensor chip 20 does not have to be rectangular; in some application scenarios, shapes other than rectangular may be preferred.
[0020] For a pixel array of rectangular shape on the sensor chip, individual pixels may be evenly placed according to grids of Cartesian coordinates, parallel to the edges, and addresses of the pixels are numbered accordingly to represent the direction of sensed objects. For pixel arrays of circular or ring shape, the individual pixels may be placed along polar coordinates, e.g., spaced by equal polar angle and radial length, to reflect equal angular step size of incoming light from objects that form images at the pixel positions. Since in some applications not all directions are of equal importance, multiple zones with different pixel densities may be implemented on the pixel array. Unevenly spaced pixels may also be implemented to correct optical deformity.
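The zoned polar placement described in the preceding paragraph can be sketched as follows; this is a minimal illustration only, and the function name, zone boundaries, and step sizes are hypothetical choices, not values from the disclosure:

```python
import math

def polar_pixel_grid(zones):
    """Generate (x, y) pixel centers for a ring-zoned polar layout.

    Each zone is (r_inner, r_outer, radial_step, angular_step_deg),
    so different zones can use different pixel densities.
    """
    pixels = []
    for r_inner, r_outer, dr, dtheta_deg in zones:
        n_angles = int(round(360.0 / dtheta_deg))
        r = r_inner
        while r < r_outer:
            for k in range(n_angles):
                theta = math.radians(k * dtheta_deg)
                pixels.append((r * math.cos(theta), r * math.sin(theta)))
            r += dr
    return pixels

# Hypothetical layout: a dense inner ring zone and a sparser outer zone,
# reflecting that directions imaged near the center matter more here.
layout = polar_pixel_grid([
    (1.0, 3.0, 0.5, 5.0),   # inner zone: 5-degree angular step
    (3.0, 5.0, 1.0, 15.0),  # outer zone: 15-degree angular step
])
```

Pixels placed this way give equal angular resolution per ring, and the per-zone parameters directly control sensing density per direction.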
[0021]
[0022] In an alternative embodiment, the convex mirror(s) and the scope(s) in the embodiments described above may be replaced by a fish-eye lens scope.
[0023] The amount of Doppler sensing data to be transferred out of the sensor chip 20 depends on: 1) the total number of pixels; and 2) the maximum bandwidth of the mixing product signals, which is proportional to the maximum Doppler shift of concern in the application and, in embodiments using an FMCW modulating signal, also depends on the FM sweeping rate and the maximum range in design. If the data interface is able to convey all digitized data from all pixels, then the chip may simply pass the mixing product signals through an anti-aliasing filter (not shown in the drawings) and export them through the interface module.
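As a rough illustration of why this data volume can be a constraint, the following sketch estimates the raw export rate for all pixels; the pixel count, Doppler bandwidth, oversampling margin, and sample resolution are hypothetical numbers, not values taken from the disclosure:

```python
def sensor_data_rate_bps(n_pixels, max_doppler_hz, bits_per_sample,
                         oversample=2.5):
    """Rough raw data rate for exporting every pixel's mixing products.

    Assumes complex (I/Q) baseband sampling at `oversample` times the
    maximum Doppler bandwidth, hence the factor of 2 for I and Q.
    """
    sample_rate_hz = oversample * max_doppler_hz
    return n_pixels * sample_rate_hz * 2 * bits_per_sample

# Hypothetical numbers: 100k pixels, 2 MHz Doppler band, 12-bit samples.
rate_bps = sensor_data_rate_bps(100_000, 2e6, 12)  # on the order of Tb/s
```

Even with modest per-pixel bandwidth, the aggregate rate lands far beyond typical chip interfaces, which motivates the on-chip pre-selection of important pixels described next.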
[0024] Which pixels are “important”? How does the sensor chip 20 determine that? The answer is application dependent. Take the example of an autonomous vehicle application in a “controlled” traffic region, in which all vehicles are equipped with beacon transmitters (e.g., the ones described in patent application U.S. Ser. No. 16/917,805) and all land structures in the region are also marked by such beacons; then the important pixels may be those exposed to beacon signals much stronger than reflected background signals. The ones with closer distances and positive Doppler shifts (i.e., approaching objects) are the most important, since they are the objects that may pose a higher risk of collision with the vehicle in question. In application scenarios that detect reflected signals, signal strength may not be a reliable measure, as it depends not only on the distance and size of objects but also on their surface properties. In that case, a high positive Doppler shift together with a close distance may be good criteria for selecting important pixels to output data.
[0025] On-chip hot spot detection is a pre-selection of pixels, and their neighboring ones, that need to be watched with higher attention, so that their data are output to the off-chip DSP for further processing. For signal-strength-based selection, one may use sum-and-dump (integrate and discharge) of the absolute values/magnitudes of the mixing product signals over a given time interval, and compare the results against a threshold. For Doppler-shift-based selection, one may use a threshold comparator (preferably with an appropriate amount of hysteresis) to detect and count the number of “sign” changes, during a given time interval, in the mixing product signals from mixers that mix with CW local replicas; alternatively, the time needed for a given number of such “sign” changes may be measured. In either case, one may choose to count only the “sign” changes in the direction of phase rotation corresponding to positive Doppler shifts (approaching objects). As known in the art, distance may also be determined based on frequency information from both the CW mixer outputs and the FMCW mixer outputs. In the selection of important pixels, quick and potentially less accurate processing may be used, relying on more accurate further processing on the DSP 60 for the final result.
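The two selection criteria described above (sum-and-dump magnitude, and counting “sign” changes of the CW mixer output) can be sketched as below. This is an illustrative software model under stated assumptions, not the on-chip circuit: it operates on the real (I) channel only, so it estimates the Doppler magnitude; distinguishing approaching from receding objects by phase-rotation direction would require both I and Q. All function names and thresholds are hypothetical.

```python
import math

def magnitude_metric(samples):
    """Sum-and-dump: accumulate absolute values over one interval."""
    return sum(abs(s) for s in samples)

def doppler_sign_changes(samples, hysteresis=0.0):
    """Count sign changes of a CW mixer output with optional hysteresis.

    The number of sign changes per interval grows with the magnitude of
    the Doppler shift; hysteresis suppresses noise-only crossings.
    """
    count, positive = 0, None
    for s in samples:
        if positive is None:
            positive = s > 0
        elif positive and s < -hysteresis:
            positive, count = False, count + 1
        elif not positive and s > hysteresis:
            positive, count = True, count + 1
    return count

def select_hot_pixels(pixel_streams, mag_threshold, crossing_threshold):
    """Return addresses of pixels passing both selection criteria."""
    return [addr for addr, samples in pixel_streams.items()
            if magnitude_metric(samples) > mag_threshold
            and doppler_sign_changes(samples) > crossing_threshold]

# Synthetic CW mixer output: 5 Doppler cycles over one 100-sample interval,
# alongside a quiet pixel that should not be selected.
samples = [math.sin(math.pi * t / 10) for t in range(100)]
hot = select_hot_pixels({(0, 0): samples, (0, 1): [0.0] * 100},
                        mag_threshold=1.0, crossing_threshold=4)
```

Only the addresses and mixing-product data of the selected pixels would then be exported, consistent with the quick, coarse on-chip screening followed by finer off-chip processing described above.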
[0026] In an alternative embodiment, the DSP 60 may be implemented on the sensor chip 20, in entirety or partially, so that processing of the signals created by all pixels is performed within the chip 20.
[0027] In some application scenarios, it is desirable to illuminate the surroundings simultaneously using said modulated light source, so that all directions of sensing interest will be illuminated. One embodiment achieves this by using the apparatus shown in the accompanying drawings.
[0028] Certain terms are used to refer to particular components. As one skilled in the art will appreciate, people may refer to a component by different names. It is not intended to distinguish between components that differ in name but not in function.
[0029] The terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to”. The terms “example” and “exemplary” are used simply to identify instances for illustrative purposes and should not be interpreted as limiting the scope of the invention to the stated instances.
[0030] Also, the term “couple” in any form is intended to mean either a direct or indirect connection through other devices and connections.
[0031] It should be understood that various modifications can be made to the embodiments described and illustrated herein, without departing from the invention, the scope of which is defined in the appended claims.