DETECTION AND RANGING SYSTEMS EMPLOYING OPTICAL WAVEGUIDES
20220357431 · 2022-11-10
Inventors
CPC classification
G01S17/32 · G02B6/4214 · G01S17/42 · G02B6/00 (all under PHYSICS)
International classification
Abstract
An optical waveguide has at least two major external surfaces and is configured for guiding light by internal reflection, and is deployed with one of the two major external surfaces in facing relation to a scene. An optical coupling-out configuration is associated with the optical waveguide and is configured for coupling a proportion of light, guided by the optical waveguide, out of the optical waveguide toward the scene. An illumination arrangement is deployed to emit light for coupling into the optical waveguide that is collimated prior to being coupled into the optical waveguide. A detector is configured for sensing light reflected from an object located in the scene in response to illumination of the object by light coupled out of the optical waveguide by the optical coupling-out configuration. A processing subsystem is configured to process signals from the detector to derive information associated with the object.
Claims
1. A system comprising: an optical waveguide having at least two major external surfaces for guiding light by internal reflection, a first of the two major external surfaces deployed in facing relation to a scene; an optical coupling-out configuration associated with the optical waveguide configured for coupling a proportion of light, guided by the optical waveguide, out of the optical waveguide toward the scene; an illumination arrangement deployed to emit light for coupling into the optical waveguide that is collimated prior to being coupled into the optical waveguide; a detector for sensing light reflected from an object located in the scene in response to illumination of the object by light coupled out of the optical waveguide by the optical coupling-out configuration; and a processing subsystem including at least one processor, the processing subsystem being electrically associated with the detector and configured to process signals from the detector to derive information associated with the object.
2. (canceled)
3. The system of claim 1, further comprising: focusing optics for focusing the reflected light onto the detector, wherein the focusing optics is associated with a second of the two major external surfaces.
4. The system of claim 1, further comprising: focusing optics for focusing the reflected light onto the detector, wherein the reflected light is transmitted by the two major external surfaces before being received by the focusing optics.
5. The system of claim 1, further comprising: focusing optics for focusing the reflected light onto the detector, wherein an output aperture of the system is defined at least in part by the coupling-out configuration, and wherein an input aperture of the system is defined at least in part by the focusing optics.
6. The system of claim 5, wherein the input aperture is at least partially overlapping with the output aperture.
7. (canceled)
8. The system of claim 1, further comprising: a diffractive optical element associated with the first of the two major external surfaces.
9. (canceled)
10. The system of claim 1, further comprising: a scanning arrangement deployed to scan the scene with light coupled out of the optical waveguide by the optical coupling-out configuration, wherein the scanning arrangement is deployed between the illumination arrangement and the optical waveguide, and wherein the scanning arrangement is configured to deflect light emitted by the illumination arrangement to cover an angular range such that the light coupled out of the optical waveguide covers a corresponding angular range.
11. The system of claim 1, further comprising: a scanning arrangement associated with the first of the two major external surfaces and deployed to scan the scene with light coupled out of the optical waveguide by the optical coupling-out configuration.
12. The system of claim 1, further comprising: collimating optics deployed in an optical path between the illumination arrangement and the optical waveguide for collimating light emitted by the illumination arrangement prior to coupling into the optical waveguide.
13. The system of claim 1, further comprising: an optical component deployed in an optical path between the illumination arrangement and the optical waveguide and configured to perform aperture expansion of light emitted by the illumination arrangement in at least a first dimension.
14. The system of claim 13, further comprising: a scanning arrangement associated with the first of the two major external surfaces and configured to scan a second dimension orthogonal to the first dimension.
15. The system of claim 13, wherein the optical component is configured to perform expansion of light emitted by the illumination arrangement in the first dimension and in a second dimension orthogonal to the first dimension.
16. The system of claim 13, wherein the optical component includes: a light-transmitting substrate for guiding light emitted by the illumination arrangement by internal reflection, and a second optical coupling-out configuration associated with the substrate for coupling a proportion of light, guided by the substrate, out of the substrate toward the optical waveguide.
17. The system of claim 1, wherein the optical coupling-out configuration is selected from the group consisting of: a plurality of partially reflective surfaces deployed within the optical waveguide obliquely to the two major external surfaces, and a diffractive optical element associated with at least one of the two major external surfaces.
18. (canceled)
19. The system of claim 1, further comprising: an optical coupling-in configuration associated with the optical waveguide and configured for coupling light into the optical waveguide so as to propagate within the optical waveguide by internal reflection.
20-31. (canceled)
32. The system of claim 1, wherein the optical waveguide has a trapezoidal shape in a cross-sectional plane so as to effect lateral scanning of the scene with light coupled out of the optical waveguide.
33. The system of claim 32, further comprising: a light-transmitting substrate having two pairs of parallel major external surfaces forming a rectangular cross-section; and an optical coupling configuration associated with the substrate, wherein light that is coupled into the substrate advances by four-fold internal reflection through the substrate and a proportion of intensity of the light advancing through the substrate is coupled out of the substrate by the optical coupling configuration and into the optical waveguide.
34. The system of claim 1, wherein the optical waveguide includes two pairs of parallel major external surfaces forming a rectangular cross-section, and wherein light that is coupled into the optical waveguide advances by four-fold internal reflection through the optical waveguide.
35. The system of claim 1, further comprising: an optical coupling configuration, and wherein the optical waveguide includes a first waveguide section associated with the optical coupling configuration and a second optical waveguide section associated with the optical coupling-out configuration, and wherein light that is coupled into the optical waveguide advances through the first waveguide section by internal reflection and a proportion of intensity of the light advancing through the first waveguide section is deflected in a first direction by the optical coupling configuration so as to be coupled out of the first waveguide section and into the second waveguide section so as to advance through the second waveguide section by internal reflection, and wherein light advancing through the second waveguide section is deflected in a second direction by the optical coupling-out configuration so as to be coupled out of the optical waveguide toward the scene.
36. The system of claim 35, wherein the optical coupling configuration effectuates scanning of light in a first dimension, and wherein the optical coupling-out configuration effectuates scanning of light in a second dimension substantially orthogonal to the first dimension.
37. A light detection and ranging (LIDAR) system comprising: a transmitter comprising: an optical waveguide having at least two major external surfaces for guiding light by internal reflection, one of the major external surfaces deployed in facing relation to a scene, an optical coupling-out configuration associated with the optical waveguide configured for coupling a proportion of light, guided by the optical waveguide, out of the optical waveguide toward the scene, at least one beam source configured to emit a coherent beam of light for coupling into the optical waveguide that is collimated prior to being coupled into the optical waveguide, and a scanning arrangement deployed to scan the scene with light coupled out of the optical waveguide by the optical coupling-out configuration; a receiver comprising: a detector for sensing light reflected from an object located in the scene in response to illumination of the object by light coupled out of the optical waveguide by the optical coupling-out configuration; and a processing subsystem including at least one processor, the processing subsystem being electrically associated with the detector and configured to process signals from the detector to construct a three-dimensional representation of the object.
38-40. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0050] Some embodiments of the present invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
[0051] Attention is now directed to the drawings, where like reference numerals or characters indicate corresponding or like components. In the drawings:
[0052]
[0053]
[0054]
[0055]
[0056]
[0057]
[0058]
[0059]
[0060]
[0061]
DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0062] The present invention is a detection and ranging system employing an optical waveguide.
[0063] The principles and operation of the system according to the present invention may be better understood with reference to the drawings accompanying the description.
[0064] Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
[0065] Referring now to the drawings,
[0066] Generally speaking, the system 10 includes a light transmitter subsystem 100 (referred to interchangeably herein as a “transmitter subsystem” or “transmitter”) for generating and directing collimated light, represented here schematically by a beam of illumination 14, toward a scene 30 (also referred to as a “region of interest”, “field of interest” or “field of view of interest”), a light receiver subsystem 200 (referred to interchangeably herein as a “receiver subsystem” or “receiver”) for receiving light reflected or backscattered from an object 18 in the scene 30 in response to illumination from the transmitter 100, and a processing subsystem 300 associated with the transmitter subsystem 100 and the receiver subsystem 200 for controlling some of the components of the transmitter subsystem 100 and for processing signals from the receiver subsystem 200 to derive information associated with the object 18.
[0067] The scene is generally considered to be whatever scenery is in front of the transmitter 100 that can be illuminated by the transmitter 100. When the system 10 is deployed for use with a vehicle, the scene 30 is generally considered to be whatever scenery is in front of the vehicle that can be illuminated by the transmitter 100. In the context of vehicle deployment, objects in the scene that can be detected and imaged by the system 10 include, for example, other vehicles, pedestrians, cyclists, trees, rocks, street signs, street lights, or any other solid body or obstruction in the path of the vehicle.
[0068] The beam 14 is scanned vertically and horizontally (laterally) across the field of interest by a scanning arrangement of the transmitter subsystem 100. The scanning beam 14 is designated by the double-headed arrow labeled as 16 in
[0069] Referring now to
[0070] The illumination and beam combining unit 102 includes an illumination arrangement 104 deployed to emit beams of light for coupling into the optical waveguide 120. The illumination arrangement 104 includes at least one beam source, preferably at least two beam sources, and more preferably at least three beam sources. The beam sources (referred to interchangeably herein as “light sources”, “illumination sources” or “sources of light”) are preferably implemented as a set (i.e., plurality) of laser sources, such as, for example, laser diodes, fiber lasers, or microchip lasers, each configured to generate (i.e., produce) a corresponding coherent beam of laser illumination.
[0071] In certain non-limiting implementations, laser sources are deployed side by side so as to emit separate beams of laser light in a common direction that form a combined beam. In other non-limiting implementations, the illumination arrangement 104 further includes a beam combiner (not shown), and laser sources are deployed at various positions relative to the beam combiner so as to combine the beams from the individual beam sources as a combined beam. Beam combiners are well-known in the art and can be implemented in various ways, using, for example, beamsplitter arrangements, dichroic mirrors, prisms, and the like.
[0072] In certain non-limiting embodiments, one of the beam sources is implemented as a visible light laser source configured to produce laser light in the visible region of the electromagnetic spectrum, and the remaining beam sources are implemented as NIR laser sources configured to produce laser light at different respective wavelengths in the NIR region of the electromagnetic spectrum. In one set of preferable but non-limiting implementations, the beam sources are implemented as a set of two or three modulated NIR laser sources and a visible light laser source placed side by side or combined via a beam combiner. The visible light laser source can be modulated for range detection or modulated so as not to transmit simultaneously during the NIR laser transmission. Alternatively, the visible light laser can be configured to operate in a continuous wave (CW) mode. The visible light laser source is preferably configured to produce light having a wavelength corresponding to a color that is easily discernible by the human eye, for example a wavelength in the range of 420-680 nm. In embodiments in which the NIR laser sources produce light at different respective wavelengths, three NIR laser sources emitting light at around 940 nm (e.g., 935 nm, 940 nm, and 945 nm, respectively), in combination with a visible light laser source, have been found to be particularly suitable for LIDAR applications. It is noted that a significantly high proportion of solar radiation intensity at wavelengths around 940 nm is typically absorbed by the atmosphere, and therefore sunlight illumination around 940 nm tends not to impinge on the optical sensor, or to impinge on the optical sensor at a relatively low intensity compared to the intensity of light that is to be detected by the optical sensor. It is also noted that all of the beam sources may emit beams at the same wavelength (e.g., all at 940 nm).
Furthermore, although a visible light laser can be used in combination with NIR lasers for eye-safety purposes, it is also possible to utilize eye-safe lasers that are outside of the NIR and visible regions. For example, lasers at the lower end of the short-wavelength infrared (SWIR) region, in particular around 1550 nm, are more eye-safe than lasers in the NIR region.
[0073] The use of beam sources that emit light at different respective wavelengths enables the receiver 200 to detect a wide variety of materials since certain types of materials may have a greater spectral response to certain wavelengths than to other wavelengths. For example, plants typically exhibit higher reflection of light at wavelengths around 700 nm. The variation in spectral response may also enable mapping of the scene, by the processing subsystem 300, by identifying wavelength-dependent changes in the intensity of signal generated by the detector 204.
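By way of non-limiting illustration, the wavelength-dependent detection described above can be sketched as a simple normalized-difference ratio of detector intensities at two wavelengths; the threshold and intensity values below are hypothetical, chosen only to illustrate the principle:

```python
def spectral_ratio(i_nir: float, i_vis: float) -> float:
    """Normalized difference of detector intensities at two wavelengths,
    analogous to a vegetation index: (NIR - VIS) / (NIR + VIS)."""
    return (i_nir - i_vis) / (i_nir + i_vis)

def classify(i_nir: float, i_vis: float, threshold: float = 0.3) -> str:
    """Toy material discriminator: plants reflect strongly at NIR wavelengths,
    so a high ratio suggests a plant-like surface. Threshold is illustrative."""
    return "vegetation-like" if spectral_ratio(i_nir, i_vis) > threshold else "other"

print(classify(0.9, 0.2))  # strongly NIR-reflective surface
print(classify(0.4, 0.5))  # comparable response at both wavelengths
```

In practice the processing subsystem 300 would apply such a comparison per detected return, using the actual source wavelengths and calibrated detector responses.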
[0074] The illumination arrangement 104, in addition to having the beam sources, and in certain instances a beam combiner, may also include various components that can be used to modify beam parameters of the beams produced by the beam sources. Such components include, but are not limited to, modulators for modulating beam intensity and/or phase and/or frequency, and amplifiers for amplifying the intensity signal of the generated beams. In certain non-limiting implementations, each beam source is associated with a modulator and an amplifier. In other implementations, only some of the beam sources are associated with a modulator and/or an amplifier.
[0075] Transmission timing of the beam sources, as well as modulation and/or amplification of the beams generated by the beam sources, is preferably controlled by the processing subsystem 300. In certain embodiments, the beams produced by the beam sources are coherently combined, and each beam source has an associated phase modulator that allows adjustment of the relative phase offsets between the beams so as to maintain phase coherence of the beams. In such embodiments, the processing subsystem 300 measures the relative phase offsets between the beams and actuates the phase modulators to adjust the phase offsets.
[0076] The light emitted by the illumination arrangement 104 may be unpolarized, or may be polarized. To produce polarized light, the illumination arrangement 104 may include a linear polarizer deployed at the output of the beam sources or at the output of the beam combiner such that the combined beam passes through the linear polarizer. In cases where the beam sources themselves are polarized sources, such a linear polarizer is not needed.
[0077] The combined beam from the beam sources, represented schematically by the thick arrow and generally designated 108, is scanned by a scanning arrangement 106. The scanning arrangement 106 preferably includes optical components that divert (i.e., deflect) incoming beams, as well as electro-mechanical components (e.g., electro-mechanical actuators) for adjusting the position and/or orientation of the optical components to effectuate divergence of the beams in a desired direction. The scanning arrangement 106 can be implemented as any suitable beam diverging or beam steering mechanism, including, for example, a single scanning or tilting mirror that performs scanning in two orthogonal dimensions (e.g., vertical and horizontal/lateral), a pair of orthogonal single-axis scanning or tilting mirrors, and a set of prisms with one or more of the prisms being rotatable/tiltable about one or more axes of rotation/tilt. Preferably, the scanning arrangement 106 is electrically associated with the processing subsystem 300, which controls the scanning action of the scanning arrangement 106.
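The two-dimensional scan described above can be modeled, by way of non-limiting illustration, as a raster of deflection angles commanded by the processing subsystem; the field-of-view and step values below are hypothetical:

```python
def raster_scan(h_fov_deg=120.0, v_fov_deg=20.0, h_step=1.0, v_step=1.0):
    """Yield (horizontal, vertical) deflection angle pairs covering the
    field of interest, centered on the optical axis. All parameters are
    illustrative, not taken from the description above."""
    v = -v_fov_deg / 2
    while v <= v_fov_deg / 2:
        h = -h_fov_deg / 2
        while h <= h_fov_deg / 2:
            yield (h, v)
            h += h_step
        v += v_step

angles = list(raster_scan())
print(len(angles), angles[0], angles[-1])
```

Each yielded pair would correspond to one commanded orientation of the scanning mirror(s), with the inner loop sweeping the lateral dimension and the outer loop stepping the vertical dimension.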
[0078] Collimating optics 110 is deployed in the optical path between the scanning arrangement 106 and the optical waveguide 120. The collimating optics 110 includes at least one optical component that collimates the scanned beam 108 onto the output aperture (i.e., exit pupil) of the illumination and beam combining unit 102. In the illustrated embodiment, the collimating optics 110 includes a pair of collimating optical elements, represented schematically as lenses 112, 114, which form an intermediate image plane 116 between the lenses 112, 114. In certain non-limiting implementations, a micro-lens array (MLA) or a diffuser is deployed at the image plane 116 to fit the exit pupil of the illumination and beam combining unit 102 to the entrance pupil (i.e., input aperture) of the optical waveguide 120. This aperture fitting by the MLA or diffuser spreads the intensity of the beam 108 across the input aperture of the optical waveguide 120, thereby reducing the overall intensity of the beam 108 that is to be coupled into the optical waveguide 120. The reduced intensity of the beam 108 further increases eye-safety, and therefore preferred implementations employ the MLA or diffuser for aperture fitting. The collimating optics 110 also generates pupil imaging between the plane of the scanning arrangement 106 and the exit pupil plane of the illumination and beam combining unit 102 (adjacent to the optical coupling-in configuration 118) such that all of the scanned beams are transmitted through the exit pupil of the illumination and beam combining unit 102 and enter the optical waveguide 120. It is noted that the illumination arrangement 104 may itself have a small exit pupil, and therefore using an MLA may not be necessary unless a uniform output beam is needed. It is also noted that in certain embodiments the illumination arrangement 104 may include collimating optics, such that the combined beam 108 from the beam sources is a collimated beam. 
For example, certain beam combiners employ embedded collimating optics such that the individual beams, in addition to being combined by the beam combiner, are also collimated by the beam combiner. In such embodiments, collimating optics 110 may not be necessary, or may be used to re-collimate the beam 108 if the beam becomes de-collimated due to the scanning by the scanning arrangement 106.
[0079] The scanned and collimated beam from the illumination and beam combining unit 102 is coupled into the optical waveguide 120 by the optical coupling-in configuration 118, represented here schematically as a suitably angled coupling prism. Other suitable optical coupling-in configurations for coupling illumination into the optical waveguide 120, such as by use of a coupling-in reflector or a diffractive optical element, are well-known in the art. The coupled in beam propagates (i.e., is guided) through the optical waveguide 120 by repeated internal reflection at the faces 122, 124. The propagating beam is represented schematically by the thick arrow and generally designated 128. In certain preferred but non-limiting implementations, the propagation through the optical waveguide 120 by internal reflection is in the form of total internal reflection (TIR), whereby incidence of the illumination (beam 128) at the faces 122, 124 at angles greater than a critical angle causes reflection of the illumination at the faces 122, 124. As is well-known in the art, the critical angle is defined by the refractive index of the material from which the optical waveguide 120 is constructed and refractive index of the medium in which the optical waveguide 120 is deployed (e.g., air). In other non-limiting implementations, the propagation through the optical waveguide 120 by internal reflection is effectuated by a reflective coating (e.g., an angularly selective reflective coating) applied to the faces 122, 124.
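The critical-angle condition for total internal reflection mentioned above can be sketched numerically; the refractive index of 1.52 below is a hypothetical glass-like value, assuming deployment in air:

```python
import math

def critical_angle_deg(n_waveguide: float, n_medium: float = 1.0) -> float:
    """Critical angle for total internal reflection, measured from the
    surface normal: theta_c = arcsin(n_medium / n_waveguide)."""
    return math.degrees(math.asin(n_medium / n_waveguide))

def is_guided(incidence_deg: float, n_waveguide: float, n_medium: float = 1.0) -> bool:
    """A ray is guided by TIR when it strikes the major external surfaces
    at an angle (from the normal) greater than the critical angle."""
    return incidence_deg > critical_angle_deg(n_waveguide, n_medium)

theta_c = critical_angle_deg(1.52)  # roughly 41 degrees for n = 1.52 in air
print(round(theta_c, 1), is_guided(55.0, 1.52), is_guided(30.0, 1.52))
```

Rays coupled in by the configuration 118 must therefore strike the faces 122, 124 at angles steeper (relative to the normal) than this critical angle in order to propagate without loss.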
[0080] The beam 128 propagates within the optical waveguide 120 and impinges on an optical coupling-out configuration associated with the optical waveguide 120, which in the illustrated embodiment is implemented as a sequence of parallel partially reflective surfaces 126 deployed within the optical waveguide 120 at an oblique angle to the faces 122, 124, where part of the intensity of the beam 128 is reflected so as to be coupled out of the optical waveguide 120 towards a scene (e.g., the scene 30 in
[0081] It is noted that the partially reflective surfaces 126 are merely illustrative of one non-limiting optical coupling-out configuration suitable for use with the optical waveguide 120, and other optical coupling configurations can be used to couple illumination out of the optical waveguide 120. The optical coupling-out configuration may be any optical coupling arrangement which deflects part of the illumination propagating within the optical waveguide 120 by internal reflection to an angle such that the deflected part of the illumination exits the optical waveguide 120. Other examples of such suitable optical coupling arrangements include, but are not limited to, one or more diffractive optical elements deployed on either of the faces 122, 124.
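One common design objective for a cascade of partially reflective surfaces such as the surfaces 126 is that each surface couples out equal intensity; this uniformity goal is an assumption introduced here for illustration, not a requirement stated above. With N surfaces, the k-th surface then needs reflectivity 1/(N - k + 1):

```python
def facet_reflectivities(n_facets: int) -> list:
    """Per-facet reflectivity giving equal coupled-out intensity at every
    facet: the k-th facet (k = 1..N) reflects 1/(N - k + 1) of the
    intensity that reaches it; the last facet reflects everything."""
    return [1.0 / (n_facets - k + 1) for k in range(1, n_facets + 1)]

def coupled_out_fractions(reflectivities: list) -> list:
    """Fraction of the original beam intensity coupled out at each facet."""
    remaining, out = 1.0, []
    for r in reflectivities:
        out.append(remaining * r)
        remaining *= (1.0 - r)
    return out

r = facet_reflectivities(4)          # [0.25, 0.333..., 0.5, 1.0]
print(coupled_out_fractions(r))      # equal output at every facet
```

Because each successive surface sees a weaker propagating beam, the reflectivities increase along the waveguide so that the four coupled-out beams carry equal intensity.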
[0082] In the non-limiting implementation illustrated in
[0083] The effect of the optical waveguide 120 and the optical coupling-out configuration on the beam 108 from the illumination and beam combining unit 102 is that the output aperture (exit pupil) of the illumination and beam combining unit 102 is multiplied (i.e., expanded) as the beam 128 propagates within the optical waveguide 120 and is coupled out of the optical waveguide 120. This aperture expansion (aperture multiplication) can be in one dimension (as is the case in the non-limiting implementation of the optical waveguide 120 in
[0084] Details of optical waveguides used in near-eye displays that perform one-dimensional aperture expansion of image illumination generated by image projectors having small output aperture for coupling-out to an eye of an observer can be found in various commonly owned issued patents, including the following which are hereby incorporated by reference in their entireties herein: U.S. Pat. Nos. 6,829,095, 7,577,326, 7,724,444, 7,751,122, 9,551,880, and 9,025,253. Details of optical waveguides used in near-eye displays that perform two-dimensional aperture expansion of image illumination generated by image projectors having small output aperture for coupling-out to an eye of an observer can be found in various commonly owned issued patents, including the following which are hereby incorporated by reference in their entireties herein: U.S. Pat. Nos. 10,133,070 and 10,551,544.
[0085] It is noted that although the faces 122, 124 are preferably mutually parallel, the requirement for parallelism is less strict for optical waveguides that are used in non-display applications, such as the optical waveguide 120 of the present embodiments in which the optical waveguide is used to illuminate a scene with laser illumination covering a desired angular range. This is in contrast to the optical waveguides in the commonly owned patents mentioned above, where any deviation of parallelism between pairs of major external surfaces will cause the image illumination that propagates through the waveguide to form non-conjugate image sets, resulting in degraded quality of the image that is coupled out of the optical waveguide to the eye of the observer.
[0086] It is noted that in many LIDAR system configurations, referred to as “common aperture” configurations, the receiver unit is located at the same aperture as the emitter unit. Benefits of systems that use common aperture configurations include lack of parallax effects that perturb the LIDAR system, and a more compact system. The non-limiting embodiment of the system 10 illustrated in
[0087] The focusing optics 202, represented schematically as a lens (but which may include a set of lenses), is deployed in an optical path between the scene and the photodetector 204. The focusing optics 202 receives the light 22A, 22B, 22C from the scene (i.e., reflected by the illuminated object in the scene) and converts the received light 22A, 22B, 22C into converging beams of light (represented schematically as light rays 23A, 23B, 23C) that impinge on the detector 204. In certain implementations, the focusing optics 202 forms an image of the object on the detector 204. The focusing optics 202 is preferably deployed to define a field of view corresponding to the region or portion of the scene that is illuminated by the transmitter 100 so as to enable the capture of light reflected from objects in the illuminated scene. In certain embodiments, a passband spectral filter may be deployed in the optical path from the scene to the detector 204 to obstruct light of wavelengths outside a given range of wavelengths within which the illumination from the illumination arrangement 104 is generated from reaching the detector 204. The spectral filter may ideally be positioned between the focusing optics 202 and the detector 204, but may alternatively be deployed between the face 124 and the focusing optics 202.
[0088] The external surfaces (i.e., faces 122, 124) of the optical waveguide 120 are preferably coated with an anti-reflection coating in order to prevent the optical waveguide 120 from scattering light that is emitted by the transmitter 100 back to the receiver 200.
[0089] In embodiments where the illumination arrangement 104 emits polarized light, the partially reflective surfaces are preferably polarization sensitive, whereby the proportion of the intensity of polarized light that is reflected by the partially reflective surfaces depends on the polarization direction of the propagating beam. In embodiments in which the transmitted beams 130A, 130B, 130C are polarized, a polarizer (not shown) is preferably deployed in the optical path between the receiver 200 and the optical waveguide 120 (for example, in association with the face 124) so as to substantially suppress saturation of the receiver 200. Note that such suppression may come at the expense of 50% transmittance of the light 22 from the scene.
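The suppression trade-off just described follows from Malus's law; a minimal numerical sketch, assuming an ideal linear polarizer:

```python
import math

def malus_transmittance(angle_deg: float) -> float:
    """Malus's law for polarized light through an ideal linear polarizer:
    T = cos^2(angle between the polarization and the polarizer axis)."""
    return math.cos(math.radians(angle_deg)) ** 2

# Stray transmitter light polarized orthogonally to the receiver's polarizer
# axis is almost fully blocked, preventing detector saturation, while
# unpolarized scene light transmits cos^2 averaged over all angles, i.e. ~0.5.
stray = malus_transmittance(90.0)
scene = 0.5
print(round(stray, 6), scene)
```

This is why the polarizer substantially suppresses saturation of the receiver 200 while costing roughly half the intensity of the (largely unpolarized) light 22 returning from the scene.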
[0090] With continued reference to
[0091] In addition to having non-overlapping apertures, the embodiment illustrated in
[0092] It is noted that the receiver 200 can be deployed relative to the transmitter 100 such that a portion of the focusing optics 202 is associated with the face 124 (i.e., a portion of the focusing optics 202 is located behind the optical waveguide 120) and the remaining portions of the focusing optics 202 is positioned adjacent to the optical waveguide 120. In such a deployment, the input aperture of the receiver 200 defined by the focusing optics 202 is partially overlapping with the output aperture of the transmitter 100.
[0093] In the non-limiting embodiments illustrated in
[0094] One method of increasing the angular range of the output beams is illustrated in
[0095] In the illustrated embodiment, the beam sources operate at different respective wavelengths (i.e., the light emitted by each beam source has a different respective wavelength) and the combined beam 128 does not disperse as it propagates through the optical waveguide 120. When the coupled-out beams 130A, 130B, 130C pass through the diffractive optical element 140, the beams 130A, 130B, 130C are dispersed by the diffractive optical element 140 to generate corresponding dispersed beams, represented schematically by the thick dashed arrows and generally designated 136A, 136B, 136C, thereby increasing the angular range covered by the beams 130A, 130B, 130C, 136A, 136B, 136C. When using a common aperture configuration, as illustrated in
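The dispersion introduced by the diffractive optical element 140 follows the standard grating equation; the 2000 nm grating period below is a hypothetical value used only to show how slightly different NIR wavelengths exit at slightly different angles, widening the covered angular range:

```python
import math

def diffracted_angle_deg(wavelength_nm: float, period_nm: float,
                         incidence_deg: float = 0.0, order: int = 1) -> float:
    """Grating equation: sin(theta_m) = sin(theta_i) + m * lambda / d,
    with theta measured from the grating normal."""
    s = math.sin(math.radians(incidence_deg)) + order * wavelength_nm / period_nm
    return math.degrees(math.asin(s))

# Beams at 935/940/945 nm leave an (illustrative) 2000 nm period grating
# at progressively larger first-order angles.
for wl in (935.0, 940.0, 945.0):
    print(wl, round(diffracted_angle_deg(wl, 2000.0), 3))
```

The per-wavelength angular offsets produced this way are what generate the dispersed beams 136A, 136B, 136C from the coupled-out beams 130A, 130B, 130C.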
[0096] As mentioned above, the optical waveguide 120 can be implemented in various ways to enable expansion of the input aperture in one dimension or two dimensions. The following paragraphs describe various implementation options of the optical waveguide 120 so as to enable aperture expansion and scanning of the scene by the coupled-out beams.
[0097] With continued reference to
[0098]
[0099]
[0100] The optical coupling between the optical waveguides 220, 320, the deployment and configuration of the partially reflective surfaces 226, 326, and the deployment of the coupling-in configuration (not illustrated here) and the illumination and beam combining unit 102 are such that, when the output beam from the illumination and beam combining unit 102 (e.g., the beam 108 in
[0101] As should be apparent, the two-dimensional aperture expansion performed by the optical waveguides described with reference to
[0102] Further details of the structure and operation of optical waveguides that are similar in structure to the optical waveguides described with reference to
[0103]
[0104] A first set of partially reflective surfaces 426a is deployed in the first section 421 of the optical waveguide 420 oblique to the planar face 424 and the plane 425, and a second set of partially reflective surfaces 426b is deployed in the second section 423 of the optical waveguide 420 oblique to the face 432. In addition, the planes containing the partially reflective surfaces 426a are oblique or perpendicular to the planes containing the partially reflective surfaces 426b.
[0105] The deployment and configuration of the partially reflective surfaces 426a, 426b, and the deployment of the coupling-in configuration (not illustrated here) and the illumination and beam combining unit 102 are such that, when the output beam from the illumination and beam combining unit 102 (e.g., the beam 108 in
[0106] Further details of the structure and operation of optical waveguides employing differently oriented sets of partially reflective surfaces for redirecting propagating illumination from one guided direction to another guided direction and coupling the illumination out of the optical waveguide can be found in the above-mentioned U.S. Pat. No. 10,551,544.
[0107] Although the embodiments of the LIDAR system described thus far have pertained to a transmitter subsystem employing a scanning arrangement as part of the illumination and beam combining unit, other embodiments are possible, in which an external scanning arrangement is deployed at the output of the optical waveguide. Referring now to
[0108] In certain embodiments, the scanning arrangement 160 is configured to perform two-dimensional scanning, whereas in other embodiments the scanning arrangement 160 is configured to perform one-dimensional scanning. In embodiments in which the scanning arrangement 160 performs two-dimensional scanning, the collimating optics 110 collimate the beam 108 transmitted by the illumination arrangement 104 (optionally with pupil imaging, as indicated by the image plane 116), and the collimated beam 108 is coupled into the optical waveguide 120 for aperture multiplication via propagation by internal reflection and coupling-out by the partially reflective surfaces 126. Here, the coupled-out beams 130A, 130B, 130C illuminate a single direction, and the scanning arrangement 160 deflects the beams 130A, 130B, 130C vertically and laterally so as to perform vertical and lateral scanning, thereby two-dimensionally scanning the entire field of interest. The deflected beams that are generated by the scanning arrangement 160 from the beams 130A, 130B, 130C are represented schematically in
[0109] In embodiments in which the scanning arrangement 160 performs one-dimensional scanning, for example vertical scanning, the illumination and beam combining unit 102 further includes an optical component 170 deployed at the output of the illumination arrangement 104 and upstream from the collimating optics 110, which is configured as a one-dimensional beam expander so as to generate a line of illumination in the far field and on the image plane 116. In one non-limiting implementation according to such embodiments, the optical component 170 is implemented as a one-dimensional scanning arrangement (similar to 106 in
[0110] In a non-limiting implementation according to other embodiments, the optical component 170 is implemented as a two-dimensional beam expander that illuminates a rectangular field. One example of such a two-dimensional beam expander is the optical waveguide with embedded partially reflective surfaces described with reference to
[0111] The scanning arrangement 160 can be implemented as any suitable beam diverging or beam steering mechanism, including, but not limited to, a single scanning or tilting mirror that performs scanning in two orthogonal dimensions, a pair of orthogonal single-axis scanning or tilting mirrors, or a set of prisms with one or more of the prisms being rotatable/tiltable about one or more axes of rotation/tilt. Preferably, the scanning arrangement 160 is electrically associated with the processing subsystem 300, which controls the scanning action of the scanning arrangement 160.
[0112] The following paragraphs describe the processing subsystem 300, and in particular the components of the processing subsystem 300 as well as the processing and control functionality provided by the processing subsystem 300. Generally speaking, the processing subsystem 300 is electrically associated with components of the transmitter 100 and the receiver 200 so as to provide both processing and control functionality to the subsystems of the LIDAR system 10. In particular, the processing subsystem 300 is electrically associated with the detector 204 and is configured to process signals from the detector 204 to derive information associated with the illuminated objects in the scene (e.g., the object 18 in
[0113] The processing subsystem 300 is preferably further configured to synchronize the detector 204 with the illumination timing of the beam sources to integrate light during integration periods corresponding to the periods of illumination by the illumination arrangement 104. In addition, the processing subsystem 300 is electrically associated with the scanning arrangements of the various embodiments, e.g., the scanning arrangement 106 (
[0114] The processing subsystem 300 may be implemented using any suitable type of processing hardware and/or software, as is known in the art, including but not limited to any combination of various dedicated computerized processors operating under any suitable operating system and implementing suitable software and/or firmware modules. The processing subsystem 300 may further include various communications components for allowing wired or wireless communication with LAN and/or WAN devices for bidirectional transfer of information. A simplified block diagram of the processing subsystem 300 according to a non-limiting example implementation is illustrated in
[0115] It is noted that in addition to the processor 302 and storage medium 304, the processing subsystem 300 may include additional electronic circuitry for receiving and/or processing analog and/or digital signals, including, for example, demodulation circuitry, frequency synthesizers, frequency mixers, band-pass filters, low-pass filters, amplifiers (e.g., low-noise amplifiers), analog-to-digital converters (for example in the form of sampling and quantization circuits), digital-to-analog converters, local oscillators, and the like. It is also noted that in certain embodiments, the processing subsystem 300 itself can be integrated as part of the receiver 200. In other embodiments, subcomponents of the processing subsystem 300 can be integrated as part of the receiver 200, while other components of the processing subsystem 300 can be stand-alone components that are separate from the receiver 200.
[0116] The optical waveguide configurations and scanning arrangements of the embodiments of LIDAR systems described above with reference to
[0117] As is well-known to those of skill in the art, the measurement principle used for generating 3D representations of objects in LIDAR systems is time-of-flight (TOF), where the beams generated by the transmitter of the LIDAR system (e.g., transmitter 100) are projected (via beam scanning) onto an object in a scene and the reflected illumination is detected (e.g., by the detector 204) and processed (e.g., by the processing subsystem 300) to determine the distance (i.e., range) to the object, allowing the creation of a 3D point cloud. The distance to the object, typically the distance from the object to the detector 204, is measured based on the round-trip delay of light waves that travel to and from the object. The distance measurement can be achieved by modulating the intensity, phase, and/or frequency of the transmitted laser illumination and measuring the time required for the modulation pattern to appear at the receiver.
[0118] One approach to TOF measurement is based on intensity modulation of short pulses of laser illumination. Here, short pulses of laser illumination are directed towards the scene, and the distance to the object in the scene is determined by multiplying the speed of light by the time the pulse takes to travel the distance to the object. As mentioned, the processing subsystem 300 preferably provides synchronization between the illumination arrangement 104 and the detector 204, thereby providing synchronization between pulse timing of the beam sources and integration periods of the detector 204. For TOF measurement, the processing subsystem 300 actuates a timer circuit (the timer circuit may be part of the processing subsystem 300) to initialize a timer upon transmission of each laser pulse, and actuates the timer circuit to terminate the timer upon receipt of an output signal from the detector 204. The detector 204 generates the output signal in response to capturing the illumination reflected from the object, where the output signal is indicative of the intensity of light captured by the detector 204. The TOF is measured as the elapsed time between the timer initialization and timer termination. Since the TOF is representative of twice the distance to the object (i.e., the distance from the transmitter to the object plus the distance from the object to the detector), the TOF should be halved in order to provide the actual range to the object. Therefore, using the simple intensity modulation approach, the distance to the object, D, can be expressed as:

D=(c×TOF)/2

where c is the speed of light (approximated as 3×10.sup.8 m/s).
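The pulsed-TOF range calculation above can be sketched in a few lines; this is an illustrative sketch only (the function name and values are not part of the disclosure), using the approximated value of c given in the text:

```python
C = 3e8  # speed of light, m/s (approximated, as in the text)

def pulse_tof_range(tof_seconds: float) -> float:
    """Convert a measured round-trip time-of-flight into range: D = (c * TOF) / 2."""
    return C * tof_seconds / 2.0

# Example: a pulse whose echo arrives 2 microseconds after transmission
# corresponds to an object 300 m away.
distance_m = pulse_tof_range(2e-6)
```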
[0119] Another approach to TOF measurement is based on amplitude modulation of a continuous wave (referred to as AMCW), whereby the phase of the transmitted illumination is compared with the phase of the detected reflected illumination. Here, the optical power of the transmitted CW laser signal is modulated with a constant frequency f.sub.M, typically a few hundred kHz, so the intensity signal of the transmitted beam is a sinusoidal or square wave of frequency f.sub.M. The detector 204 captures the reflected illumination from the object and generates an output signal indicative of the intensity of the captured illumination. The distance measurement, D, is derived based on the phase shift, ΔΦ, that occurs between the transmitted intensity signal and the reflected intensity signal, as well as the modulation frequency f.sub.M, and can be expressed as follows:

D=(c×ΔΦ)/(4πf.sub.M)

where again, c is the speed of light.
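The AMCW phase-to-range conversion can be sketched as follows; this is a minimal illustration (the function name is hypothetical), not an implementation from the disclosure:

```python
import math

C = 3e8  # speed of light, m/s

def amcw_range(delta_phi: float, f_m: float) -> float:
    """Range from the phase shift between transmitted and received
    intensity signals: D = (c * delta_phi) / (4 * pi * f_m)."""
    return C * delta_phi / (4.0 * math.pi * f_m)

# Example: a phase shift of pi radians at f_M = 1 MHz gives D = 75 m.
d = amcw_range(math.pi, 1e6)
```

Note that because the measured phase wraps every 2π, this scheme has an unambiguous range of c/(2f.sub.M), which motivates the choice of modulation frequency.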
[0120] Techniques for demodulating the generated intensity signal and extracting phase information are well-known in the art, but several brief examples are provided herein. In one non-limiting example, phase measurement can be obtained using an arrangement of mixers and low-pass filters, or by sampling the generated intensity signal and cross-correlating the sampled signal with the transmitted phase signal that is shifted by a number of fixed phase offsets. Another approach involves sampling the generated intensity signal and mixing it with the transmitted phase signal that is shifted by a number of fixed phase offsets, and then sampling the mixed signal at the resultant number of phases. The various techniques mentioned here utilize various electronic components, including, for example, mixers, filters, local oscillators, analog-to-digital converters, digital-to-analog converters, and the like, and can be implemented as electronic circuitry that can be wholly part of the receiver 200, wholly part of the processing subsystem 300, or shared between the processing subsystem 300 and the receiver 200.
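The fixed-phase-offset sampling technique mentioned above is commonly realized as a "four-bucket" estimator, in which the intensity is sampled at four offsets (0°, 90°, 180°, 270°) of the modulation period. The sketch below is a general illustration of that well-known estimator under ideal, noise-free conditions, not code from the disclosure:

```python
import math

def four_bucket_phase(samples):
    """Recover the phase shift from four intensity samples A0..A3 taken at
    0, 90, 180 and 270 degrees of the modulation period:
    phase = atan2(A3 - A1, A0 - A2). The DC offset cancels in the differences."""
    a0, a1, a2, a3 = samples
    return math.atan2(a3 - a1, a0 - a2)

def buckets(phase, amp=1.0, offset=2.0):
    """Generate ideal synthetic bucket samples for a known phase shift."""
    return [offset + amp * math.cos(phase + k * math.pi / 2) for k in range(4)]

# Round-trip check: a known phase of 0.7 rad is recovered from its buckets.
recovered = four_bucket_phase(buckets(0.7))
```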
[0121] Another approach to TOF measurement is based on frequency modulation of a continuous wave (referred to as FMCW), whereby the instantaneous optical frequency of the transmitted intensity signal is periodically shifted, typically by varying the output power of the beam sources. As in the AMCW approach, the detector 204 captures the reflected illumination from the object and generates an output signal indicative of the intensity of the captured illumination. However, here the signal generated by the detector 204 is mixed with the transmitted source signal to create a beat frequency that can be used to measure the object distance. For a static object, the time delay between the transmission of the laser illumination and the collection of illumination by the detector 204 causes a constant frequency difference (i.e., beat frequency) f.sub.B from the mixing of the signals. By linearly varying the instantaneous optical frequency of the transmitted laser illumination over a period T, the beat frequency f.sub.B varies in direct proportion to TOF, and is therefore likewise proportional to the distance D to the object. The proportional relationship between f.sub.B and TOF can be expressed as follows:
f.sub.B/TOF=B/T

and therefore, D can be expressed as:

D=(c×T×f.sub.B)/(2B)

where B is the bandwidth of the frequency sweep.
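The ramp-FMCW range formula can be sketched as follows; the function name and example values are illustrative assumptions, not taken from the disclosure:

```python
C = 3e8  # speed of light, m/s

def fmcw_range(f_beat: float, sweep_period: float, bandwidth: float) -> float:
    """Range from the beat frequency of a linear ramp sweep of bandwidth B
    over period T: D = (c * T * f_B) / (2 * B)."""
    return C * sweep_period * f_beat / (2.0 * bandwidth)

# Example: a 1 MHz beat with a 1 GHz sweep over 100 microseconds gives 15 m.
d = fmcw_range(1e6, 1e-4, 1e9)
```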
[0122] The frequency difference between the transmitted and received signal is manifested as a periodic phase difference, which causes an alternating constructive and destructive interference pattern at the beat frequency f.sub.B, thereby producing a beat signal at frequency f.sub.B. The peak of the beat frequency f.sub.B can be easily translated into distance by analyzing the beat signal in the frequency domain by way of Fourier analysis. One particular preferred technique for performing the frequency domain analysis is by Fast Fourier Transform (FFT). FFT algorithms are well-known in the art, and can be implemented using the processing subsystem 300.
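Locating the beat-frequency peak in the spectrum can be illustrated as follows. For brevity the sketch uses a naive O(n²) DFT in place of a true FFT; a practical implementation would use an FFT routine, and the signal here is synthetic:

```python
import cmath
import math

def dft_peak_bin(signal):
    """Return the DFT bin (excluding DC, up to Nyquist) with the largest
    magnitude; for a beat signal this bin corresponds to f_B."""
    n = len(signal)
    best_bin, best_mag = 0, -1.0
    for k in range(1, n // 2):
        s = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        if abs(s) > best_mag:
            best_bin, best_mag = k, abs(s)
    return best_bin

# A synthetic beat tone occupying bin 5 of a 64-sample window:
sig = [math.cos(2 * math.pi * 5 * t / 64) for t in range(64)]
peak = dft_peak_bin(sig)
```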
[0123] In the example described above, the instantaneous frequency is varied linearly and monotonically increases so as to produce a ramp modulation frequency. However, in many practical applications of FMCW, a triangular modulation frequency is used instead of a ramp. Here, the sweep covers the bandwidth B in half a modulation period, so the rate of frequency change is expressed as 2f.sub.MB, where f.sub.M is the modulation frequency. Hence, the beat frequency f.sub.B can be expressed as follows:

f.sub.B=(4×D×f.sub.M×B)/c
[0124] Here too the beat signal can be analyzed by applying FFT algorithms to translate the peak of the beat frequency f.sub.B into distance. This triangular modulation is of particular value when used to detect moving objects, where the velocity (i.e., speed and direction) of the object can be determined by calculating the Doppler frequency.
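The separation of range and Doppler under triangular modulation can be sketched as below. This is a standard illustration under the assumed sign convention that the Doppler shift subtracts from the up-sweep beat and adds to the down-sweep beat; the function name and example values are hypothetical:

```python
C = 3e8  # speed of light, m/s

def range_and_velocity(f_up, f_down, f_m, bandwidth, carrier_freq):
    """Separate range and radial velocity from the up- and down-sweep beat
    frequencies of a triangular FMCW sweep, assuming
    f_up = f_range - f_doppler and f_down = f_range + f_doppler."""
    f_range = (f_up + f_down) / 2.0
    f_doppler = (f_down - f_up) / 2.0
    distance = C * f_range / (4.0 * f_m * bandwidth)  # from f_B = 4*D*f_M*B/c
    velocity = C * f_doppler / (2.0 * carrier_freq)   # from f_D = 2*v*f_c/c
    return distance, velocity

# Example: beats of 1.8/2.2 MHz with f_M = 1 kHz, B = 1 GHz, and a 300 THz
# optical carrier correspond to a 150 m object receding at 0.1 m/s.
d, v = range_and_velocity(1.8e6, 2.2e6, 1e3, 1e9, 3e14)
```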
[0125] It is noted that all of the above techniques for determining TOF and distance to the object have been described within the context of pointwise measurements, in which a single pulse or single modulated beam of laser illumination is transmitted by the transmitter 100 so as to illuminate a point on an object in the scene, and whereby the receiver 200 (in particular the detector 204) captures light reflected from the object in response to the aforementioned illumination, and the processing subsystem 300 derives the TOF and distance information based on the signals generated by the detector 204 in response to capturing light. However, as is well-known in the art, one of the key outputs generated by LIDAR systems is a 3D representation of the illuminated objects, typically in the form of a 3D point cloud or a 3D image rendered therefrom. Such point clouds are typically generated by scanning the field of view so as to illuminate a large number of points of the object, and responsively calculating TOF and distance from the captured backscattered (reflected) light for each illuminated point. According to preferred embodiments of the present invention, the processing subsystem 300 is configured to generate such a 3D representation, for example, a point cloud, by scanning the field of view (using the techniques enabled by the optical waveguide and scanning arrangement configurations described with reference to
[0126] Generally speaking, the density of the point cloud is limited by the scanning rate (i.e., how fast different regions within the scene are illuminated) and the capture rate of the detector 204. When the detector 204 is implemented as a sensor matrix or rectangular pixel array, the detector 204 can capture reflected light from multiple regions simultaneously, thereby providing a higher overall capture rate. Preferably, the transmitter 100 and the receiver 200 are configured so as to enable the processing subsystem 300 to produce a “high-density” point cloud that resembles a 3D image. The processing subsystem 300 may also be configured to convert the 3D point cloud into a two-dimensional (2D) depth image using techniques that are well-known in the art.
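One common way to perform the point-cloud-to-depth-image conversion mentioned above is spherical projection, binning each point by its azimuth and elevation and storing the nearest range per pixel. The sketch below illustrates that general technique under assumed conventions (z forward, symmetric fields of view); it is not an implementation from the disclosure:

```python
import math

def point_cloud_to_depth_image(points, width, height, h_fov, v_fov):
    """Project (x, y, z) points (z pointing forward) into a width x height
    depth image by spherical binning; each pixel keeps the nearest range,
    with 0.0 marking pixels where no point landed."""
    image = [[0.0] * width for _ in range(height)]
    for x, y, z in points:
        r = math.sqrt(x * x + y * y + z * z)
        az = math.atan2(x, z)                          # azimuth angle
        el = math.atan2(y, math.sqrt(x * x + z * z))   # elevation angle
        col = int((az / h_fov + 0.5) * (width - 1))
        row = int((el / v_fov + 0.5) * (height - 1))
        if 0 <= col < width and 0 <= row < height:
            if image[row][col] == 0.0 or r < image[row][col]:
                image[row][col] = r
    return image

# Example: a single point 10 m straight ahead lands in the center pixel.
img = point_cloud_to_depth_image([(0.0, 0.0, 10.0)], 5, 5, math.pi / 2, math.pi / 3)
```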
[0127] Although the embodiments of the present invention described thus far have pertained to using an illumination arrangement having beam sources configured to produce light having wavelengths in the NIR region of the electromagnetic spectrum and/or the visible region of the electromagnetic spectrum, other embodiments are possible in which the illumination arrangement includes one or more beam sources configured to produce light outside of the NIR and visible regions, including, for example, beam sources configured to produce light in the ultraviolet region of the electromagnetic spectrum.
[0128] The operational range that is achievable by the system according to the embodiments described herein is typically a function of several parameters, including, for example, beam wavelength, beam intensity, pulse duration, and beam divergence. Some of these parameters may be adjusted by controlled input to the illumination arrangement 104 from the processing subsystem 300, while other parameters can be adjusted by modifying the various optical and scanning components of the illumination and beam combining unit 102, while yet other parameters can be adjusted by changing the type of beam source(s) deployed in the illumination arrangement 104. Those of ordinary skill in the art will understand how to tune the various parameters in order to achieve the desired operational range. By tuning some of these parameters, the system according to the embodiments described herein can achieve a superior operational range to conventional LIDAR systems. Neglecting atmospheric attenuation, beam divergence or other degradation factors, conventional LIDAR systems that employ NIR lasers that operate at around 900 nm have a maximum operational range of approximately 100 meters. In a non-limiting example, assuming a predefined intensity (for eye-safety) at the input aperture of the optical waveguide 120, and assuming that the optical waveguide 120 provides three-fold expansion of the aperture (in two dimensions), the total output power at the output aperture of the transmitter 100 is increased by a factor of 9, and therefore the operational range of the system 10 is increased by a factor of 3 (in accordance with the inverse-square law) as compared to conventional LIDAR systems. Therefore, the operational range achievable by the LIDAR systems of the present invention can be expected to be at least 300 meters.
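The range-scaling arithmetic in the preceding paragraph can be made explicit; the sketch below simply encodes that reasoning (aperture area grows as the square of the one-dimensional expansion factor, and range grows as the square root of power under the inverse-square law), with hypothetical names:

```python
import math

def range_gain_from_aperture_expansion(expansion_factor: float) -> float:
    """Two-dimensional aperture expansion by `expansion_factor` multiplies the
    eye-safe output power by expansion_factor**2; by the inverse-square law,
    operational range scales as sqrt(power), i.e., by `expansion_factor`."""
    power_gain = expansion_factor ** 2
    return math.sqrt(power_gain)

# Example from the text: three-fold expansion -> 9x power -> 3x range,
# so a 100 m conventional baseline becomes 300 m.
baseline_m = 100.0
expanded_range_m = baseline_m * range_gain_from_aperture_expansion(3.0)
```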
[0129] It is noted that when the LIDAR system according to embodiments of the present invention is deployed in a driver-operated ground-based vehicle or deployed for use with a driver-operated ground-based vehicle, the optical waveguides of the disclosed embodiments may be advantageously installed in front of the driver of the vehicle, for example integrated into the dashboard or front windshield of the vehicle. When the LIDAR system is deployed as part of a helmet, the optical waveguides of the disclosed embodiments may be advantageously installed as part of the helmet in a front region of the helmet.
[0130] Although the embodiments of the LIDAR system disclosed thus far have been described within the context of LIDAR applications for use with ground-based vehicles such as autonomous or semi-autonomous vehicles, the embodiments of the present invention can also be used to advantage in stationary terrestrial LIDAR applications, and airborne LIDAR applications such as remote sensing applications. For terrestrial applications, embodiments are contemplated herein in which the system is deployed on a stationary platform, such as a mount or tower, in order to collect data associated with objects in the scene. For airborne applications, embodiments are contemplated in which the system is deployed in, or mounted to, an aircraft, such as a manned (i.e., human-piloted) aircraft (e.g., an airplane, a helicopter, etc.) or an unmanned aircraft (e.g., unmanned aerial vehicle (UAV), drone, etc.). In such embodiments, the system is preferably deployed at the underside or belly of the aircraft, thereby enabling the system to collect data associated with objects in a remote scene on the ground that is monitored by the aircraft (typically travelling at an altitude in the range of 10-100 meters, or up to 1 kilometer when employing high-intensity laser sources deployed on small UAVs or drones).
[0131] It is noted that although the transmitter and receiver of the embodiments disclosed thus far have been described within the specific context of use in LIDAR applications, in particular LIDAR systems deployed for use with ground-based or airborne vehicles, transmitter and receiver configurations based on the above-described embodiments may be suitable for use in non-LIDAR applications in which scene scanning is not necessary, such as in laser rangefinder applications. For example, transmitter configurations without the scanning arrangements of the above-described embodiments can be used to advantage as part of ground-mounted or hand-held laser rangefinder systems, where a single point or small cluster of points in a scene is illuminated without scanning in order to measure the distance to the point or point-cluster.
[0132] The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
[0133] As used herein, the singular form, “a”, “an” and “the” include plural references unless the context clearly dictates otherwise.
[0134] The word “exemplary” is used herein to mean “serving as an example, instance or illustration”. Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
[0135] It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
[0136] To the extent that the appended claims have been drafted without multiple dependencies, this has been done only to accommodate formal requirements in jurisdictions which do not allow such multiple dependencies. It should be noted that all possible combinations of features which would be implied by rendering the claims multiply dependent are explicitly envisaged and should be considered part of the invention.
[0137] Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.