LASER RADAR
20220082697 · 2022-03-17
Assignee
Inventors
- Anthony SLOTWINSKI (Nellysford, VA, US)
- Ghassan CHAMSINE (Haymarket, VA, US)
- Tom HEDGES (Haymarket, VA, US)
- Paul LIGHTOWLER (Birmingham, GB)
- Daniel Gene Smith (Tucson, AZ, US)
- Yasuhiro HIDAKA (Fujisawa-shi, JP)
- Masanori ARAI (Kamakura-shi, JP)
CPC classification
G02B27/288
PHYSICS
G02B27/148
PHYSICS
G02B27/286
PHYSICS
G01S7/481
PHYSICS
G01S17/66
PHYSICS
G01S17/86
PHYSICS
G01S17/894
PHYSICS
G01S17/42
PHYSICS
International classification
G01S7/481
PHYSICS
Abstract
Laser radars include two-part objective lenses used for imaging a target and directing probe beams to the target. Portions of tracer beams that degrade target images are attenuated with dichroic filters that block central portions of the tracer beams. Local oscillator beams can be produced with a mixing lens that directs probe beams through a polarizing beam splitter to focus at or on a waveplate so that portions are reflected from the waveplate as local oscillator beams. An imaging system that is confocal with a probe beam is coupled to provide target measurements or to establish a focus of the probe beam.
Claims
1. An apparatus, comprising: a dichroic beam splitter; an optical fiber situated to direct a probe beam and a tracer beam to the dichroic beam splitter along an axis; an objective lens situated on the axis and comprising a fixed lens and a movable lens, the movable lens situated to receive the probe beam from the dichroic beam splitter and direct the probe beam to a target along the axis; and an image sensor optically coupled to the dichroic beam splitter and situated on the axis to receive an imaging beam from the target via the dichroic beam splitter, wherein the movable lens is translatable to form a target image at the image sensor and focus the probe beam at the target.
2. The apparatus according to claim 1, wherein the dichroic beam splitter is situated so that the probe beam is transmitted through the dichroic beam splitter to the movable lens and the imaging beam is reflected by the dichroic beam splitter to the image sensor.
3. The apparatus according to claim 1, wherein the objective lens is situated to receive the tracer beam from the dichroic beam splitter and direct the probe beam and the tracer beam to the target, wherein the probe beam has a wavelength between 1200 nm and 1800 nm and the tracer beam has a wavelength between 400 nm and 700 nm.
4. The apparatus according to claim 1, wherein the dichroic beam splitter is situated so that the probe beam is reflected by the dichroic beam splitter to the movable lens and the imaging beam is transmitted by the dichroic beam splitter to the image sensor.
5. The apparatus according to claim 1, wherein the dichroic beam splitter is a cube dichroic beam splitter, a plate dichroic beam splitter, or a double-reflecting dichroic beam splitter.
6. The apparatus according to claim 1, wherein the dichroic beam splitter is a double-reflecting dichroic beam splitter that includes a first surface facing the movable lens and a dichroic reflecting surface situated to direct the imaging beam to the image sensor and the portion of the probe beam returned from the target to the optical fiber.
7. The apparatus according to claim 1, wherein the dichroic beam splitter is a double-reflecting dichroic beam splitter that includes a first surface facing the movable lens and a dichroic reflecting surface situated to direct the imaging beam to the first surface so that the imaging beam is reflected to the image sensor by the first surface, and the portion of the probe beam returned from the target to the optical fiber is transmitted by the reflecting surface to the optical fiber.
8. The apparatus according to claim 1, wherein the dichroic beam splitter is a double-reflecting dichroic beam splitter that includes a first surface facing the movable lens and a dichroic reflecting surface situated to direct the portion of the probe beam returned from the target to the first surface, and the imaging beam is transmitted by the dichroic reflecting surface to the image sensor.
9. The apparatus according to claim 6, wherein the first surface is situated at an angle greater than a critical angle with respect to the imaging beam received from the dichroic reflecting surface.
10. The apparatus according to claim 7, wherein the first surface is situated at an angle greater than a critical angle with respect to the imaging beam received from the dichroic reflecting surface.
11. The apparatus according to claim 8, wherein the first surface is situated at an angle greater than a critical angle with respect to the imaging beam received from the dichroic reflecting surface.
12. The apparatus according to claim 6, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the portion of the probe beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
13. The apparatus according to claim 7, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the portion of the probe beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
14. The apparatus according to claim 8, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the portion of the probe beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
15. The apparatus according to claim 6, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the imaging beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
16. The apparatus according to claim 7, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the imaging beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
17. The apparatus according to claim 8, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the imaging beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
18. The apparatus according to claim 6, wherein the double-reflecting dichroic beam splitter includes a first prism having a vertex angle between the first surface and the dichroic reflecting surface, wherein the vertex angle is greater than sin⁻¹(1/n), wherein n is a refractive index of the prism.
19. The apparatus according to claim 7, wherein the double-reflecting dichroic beam splitter includes a first prism having a vertex angle between the first surface and the dichroic reflecting surface, wherein the vertex angle is greater than sin⁻¹(1/n), wherein n is a refractive index of the prism.
20. The apparatus according to claim 8, wherein the double-reflecting dichroic beam splitter includes a first prism having a vertex angle between the first surface and the dichroic reflecting surface, wherein the vertex angle is greater than sin⁻¹(1/n), wherein n is a refractive index of the prism.
21. The apparatus according to claim 18, wherein the dichroic reflecting surface of the double-reflecting dichroic beam splitter is defined on a surface of the first prism.
22. The apparatus according to claim 19, wherein the dichroic reflecting surface of the double-reflecting dichroic beam splitter is defined on a surface of the first prism.
23. The apparatus according to claim 20, wherein the dichroic reflecting surface of the double-reflecting dichroic beam splitter is defined on a surface of the first prism.
24. The apparatus according to claim 18, wherein the double-reflecting dichroic beam splitter includes a first prism and a second prism secured to each other at respective mating surfaces, and the dichroic reflective surface is situated at the mating surfaces.
25. The apparatus according to claim 19, wherein the double-reflecting dichroic beam splitter includes a first prism and a second prism secured to each other at respective mating surfaces, and the dichroic reflective surface is situated at the mating surfaces.
26. The apparatus according to claim 20, wherein the double-reflecting dichroic beam splitter includes a first prism and a second prism secured to each other at respective mating surfaces, and the dichroic reflective surface is situated at the mating surfaces.
27. The apparatus according to claim 24, wherein the dichroic reflecting surface is defined on at least one of the mating surfaces.
28. The apparatus according to claim 25, wherein the dichroic reflecting surface is defined on at least one of the mating surfaces.
29. The apparatus according to claim 26, wherein the dichroic reflecting surface is defined on at least one of the mating surfaces.
30. The apparatus according to claim 1, wherein the dichroic beam splitter includes a dichroic plate and a plane reflector, wherein the dichroic plate is situated to direct the portion of the probe beam returned from the target to the plane reflector and transmit the imaging beam to the image sensor.
31. The apparatus according to claim 1, wherein the dichroic beam splitter includes a dichroic plate and a plane reflector, wherein the dichroic plate is situated to reflect the imaging beam to the plane reflector and transmit the portion of the probe beam returned from the target.
32. The apparatus according to claim 1, wherein the optical fiber is a polarization retaining single mode (PRSM) optical fiber, and further comprising a polarizing beam splitter (PBS) situated so that the probe beam from the PRSM optical fiber is received by the PBS in a state of polarization that is substantially transmitted by the PBS to the dichroic beam splitter.
33. The apparatus according to claim 32, wherein the state of polarization is a linear state of polarization.
34. The apparatus according to claim 32, further comprising a waveplate situated between the PBS and the dichroic beam splitter to produce a circular state of polarization in the probe beam and to reflect a portion of the probe beam towards the optical fiber to produce a local oscillator beam.
35. The apparatus according to claim 34, wherein the waveplate has an input surface situated to receive the probe beam from the PBS and an output surface situated to receive the probe beam from the input surface of the waveplate, wherein one of the input surface or the output surface is antireflection coated and the other of the input surface and the output surface reflects a portion of the probe beam as the local oscillator beam.
36. The apparatus according to claim 1, further comprising a mixing lens situated to receive the measurement beam from the optical fiber and a dichroic filter situated along the axis on an axial portion of the mixing lens, wherein the dichroic filter is transmissive to the measurement beam and non-transmissive to the tracer beam.
37. The apparatus according to claim 1, further comprising a dichroic reflector situated along the axis on an axial portion of the mixing lens, wherein the dichroic filter is a dichroic reflector that is transmissive to the measurement beam and reflective to the tracer beam.
38. The apparatus according to claim 36, wherein the dichroic filter is a wavelength-dependent polarizer that is substantially non-transmissive to the tracer beam.
39. The apparatus according to claim 37, wherein the dichroic filter is a wavelength-dependent polarizer that is substantially non-transmissive to the tracer beam.
40. The apparatus according to claim 1, further comprising a dichroic reflector situated along the axis on an axial portion of the mixing lens, the dichroic reflector transmissive to the measurement beam and reflective to the tracer beam, wherein a dimension of the dichroic reflector is based on a corresponding dimension of the image sensor.
41. The apparatus according to claim 1, further comprising a mixing lens situated to receive the measurement beam and focus the measurement beam; and a dichroic reflector situated along the axis on an axial portion of the mixing lens, the dichroic reflector transmissive to the measurement beam and reflective to the tracer beam, wherein a dimension of the dichroic reflector is based on a corresponding dimension of the image sensor.
42. The apparatus according to claim 37, wherein a dimension of the dichroic reflector is at least 0.5, 0.75, 1.0, or 1.5 times a product of a corresponding dimension of the image sensor and a ratio of an optical distance along the axis from the mixing lens focus to the dichroic reflector to an optical distance from the mixing lens focus to the image sensor.
43. The apparatus according to claim 40, wherein a dimension of the dichroic reflector is at least 0.5, 0.75, 1.0, or 1.5 times a product of a corresponding dimension of the image sensor and a ratio of an optical distance along the axis from the mixing lens focus to the dichroic reflector to an optical distance from the mixing lens focus to the image sensor.
44. The apparatus according to claim 41, wherein a dimension of the dichroic reflector is at least 0.5, 0.75, 1.0, or 1.5 times a product of a corresponding dimension of the image sensor and a ratio of an optical distance along the axis from the mixing lens focus to the dichroic reflector to an optical distance from the mixing lens focus to the image sensor.
45. The apparatus according to claim 37, wherein the dichroic filter is situated on a lens surface of the movable lens.
46. The apparatus according to claim 40, wherein the dichroic filter is situated on a lens surface of the movable lens.
47. The apparatus according to claim 41, wherein the dichroic filter is situated on a lens surface of the movable lens.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0066] As used in this application and in the claims, the singular forms “a,” “an,” and “the” include the plural forms unless the context clearly dictates otherwise. Additionally, the term “includes” means “comprises.” Further, the term “coupled” does not necessarily exclude the presence of intermediate elements between the coupled items. In some cases, elements are referred to as directly coupled to exclude intermediate elements.
[0067] The systems, apparatus, and methods described herein should not be construed as limiting in any way. Instead, the present disclosure is directed toward all novel and non-obvious features and aspects of the various disclosed embodiments, alone and in various combinations and sub-combinations with one another. The disclosed systems, methods, and apparatus are not limited to any specific aspect or feature or combinations thereof, nor do the disclosed systems, methods, and apparatus require that any one or more specific advantages be present or problems be solved. Any theories of operation are to facilitate explanation, but the disclosed systems, methods, and apparatus are not limited to such theories of operation.
[0068] Although the operations of some of the disclosed methods are described in a particular, sequential order for convenient presentation, it should be understood that this manner of description encompasses rearrangement, unless a particular ordering is required by specific language set forth below. For example, operations described sequentially may in some cases be rearranged or performed concurrently. Moreover, for the sake of simplicity, the attached figures may not show the various ways in which the disclosed systems, methods, and apparatus can be used in conjunction with other systems, methods, and apparatus. Additionally, the description sometimes uses terms like “produce” and “provide” to describe the disclosed methods. These terms are high-level abstractions of the actual operations that are performed. The actual operations that correspond to these terms will vary depending on the particular implementation and are readily discernible by one of ordinary skill in the art.
[0069] For convenience in the following description, the terms “light” and “optical radiation” refer to propagating electromagnetic radiation in a wavelength range of 300 nm to 10 μm, but other wavelengths can be used. Such radiation can be directed to one or more targets to be profiled, detected, or otherwise investigated. This radiation is referred to herein as propagating in one or more “beams” that are typically based on optical radiation produced by a laser such as a laser diode. As used in this application, beams need not be collimated, and propagating radiation in a waveguide is referred to as a beam as well. Beams can have a spatial extent associated with one or more laser transverse modes, and can be substantially collimated. Wavelengths for which optical fibers or other optical waveguides and coherent laser sources are readily available are convenient. In some examples, laser diodes at wavelengths around 1550 nm are used.
[0070] For convenience, beams are described as propagating along one or more axes. Such axes generally are based on one or more line segments, so that an axis can include a number of non-collinear segments as the axis is bent or folded or otherwise responsive to mirrors, prisms, lenses, and other optical elements. The term “lens” is used herein to refer to a single refractive optical element (a singlet) or a compound lens that includes one or more singlets, doublets, or other compound lenses. In some examples, beams are shaped or directed by refractive optical elements, but in other examples, reflective optical elements such as mirrors are used, or combinations of refractive and reflective elements are used. Such optical systems can be referred to as dioptric, catoptric, and catadioptric, respectively. Other types of refractive, reflective, diffractive, holographic, and other optical elements can be used as may be convenient. In some examples, beam splitters such as cube beam splitters are used to separate an input beam into a transmitted beam and a reflected beam. Either of these beams can be arranged to serve as a measurement beam or a local oscillator beam in a coherent detection system as may be convenient. Beam splitters can also be provided as fiber couplers, and polarizing beam splitters are preferred in some embodiments. The term “beam splitter” is also typically used to refer to beam combiners. Fiber couplers and fiber wavelength division multiplexers (WDMs) can combine or separate beams.
[0071] In the disclosed examples, laser radar systems are configured to scan a probe or measurement beam over a scan path that can be a polygon, portions of a closed curve, a raster, a w-pattern, or other pattern, and scanning can be periodic or aperiodic. In response to a measurement beam or a probe beam directed to a target, a return beam is obtained based on reflection, scattering, diffraction, refraction, or other process at the target. Evaluation of the return beam permits estimation of target properties. The examples below are provided with respect to a laser radar that is configured to, for example, provide an estimate of surface topography based on portions of an optical beam directed to a surface that are returned to a receiver. The disclosed methods and apparatus can also be incorporated into laser tracker systems.
[0072] In some examples described herein, a measurement optical beam is divided into a probe beam that is directed to a target, a reference beam that can be used for calibration by being directed to a reference length, and/or a local oscillator beam used for heterodyne detection and target distance estimation in combination with the probe beam. In other examples, a beam directed to a target is referred to as a probe beam and a portion returned for detection is referred to as a signal beam. In the disclosed examples, portions of one or more optical beams are directed to a target, directed to detectors, or communicated to one or more destinations. As used herein, a beam portion refers to any fraction of an optical beam, including the entire optical beam. In many examples, a pointing or tracer beam propagates along with one or more probe beams to a target. The tracer beam is at a visible wavelength and permits a user to confirm that the probe beam is directed to the intended target location. Such a tracer beam is otherwise unused, and in some cases, undesirable tracer beam reflections can interfere with boresight camera images of a target. Probe beams are typically at wavelengths greater than about 900 nm, and suitable beam sources frequently have wavelengths around 1300 nm and 1500 nm. Other wavelengths can be used.
[0073] The disclosed systems generally include one or more beam splitters such as polarizing beam splitters (PBSs) and dichroic beam splitters (DBSs) such as cube or plate beam splitters. Beam splitting surfaces can be provided on plate surfaces, prism surfaces, lens surfaces or other curved or planar surfaces. As used herein, a DBS is a beam splitter that preferentially reflects (or transmits) in a first wavelength range and preferentially transmits (or reflects) in a second wavelength range. For convenient description, an angle (tilt angle) of a beam splitter surface is measured from an axis perpendicular to the beam splitter surface with respect to an optical axis. While PBSs and DBSs permit efficient use of probe beams and superior target image intensities, polarization and wavelength independent (neutral) beam splitters can also be used.
[0074] In some examples, rotations are described with reference to azimuthal angles and elevational angles. While such angles are typically defined with respect to vertical and horizontal axes, as used herein, alignment with vertical and horizontal is not required. Systems are typically described with reference to such angles assuming a standard in-use orientation.
[0075] In typical examples described below, probe beams directed to targets are polarized, but unpolarized or randomly polarized beams can be used. Optical filters are referred to as non-transmissive for transmittances of 5%, 2%, 1%, or less. Beams such as probe and tracer beams can be focused at or near surfaces of interest. As used herein, a beam is referred to as being focused at a surface if a beam waist is within ±0.5, 1, 2, 5, or 10 Rayleigh ranges of the surface.
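The focus criterion above can be made concrete for a Gaussian beam, whose Rayleigh range is z_R = πw₀²/λ; the waist radius, wavelength, and offset below are assumed values chosen only for illustration.

```python
import math

def rayleigh_range(waist_radius_m: float, wavelength_m: float) -> float:
    """Gaussian-beam Rayleigh range z_R = pi * w0**2 / lambda."""
    return math.pi * waist_radius_m ** 2 / wavelength_m

def focused_at_surface(waist_offset_m: float, waist_radius_m: float,
                       wavelength_m: float, n_rayleigh: float = 1.0) -> bool:
    """True if the beam waist lies within +/- n_rayleigh Rayleigh ranges
    of the surface, matching the focus criterion used in the text."""
    return abs(waist_offset_m) <= n_rayleigh * rayleigh_range(
        waist_radius_m, wavelength_m)

# Assumed example: a 1550 nm probe beam focused to a 100 um waist radius.
z_r = rayleigh_range(100e-6, 1550e-9)               # ~20.3 mm
print(f"Rayleigh range: {z_r * 1e3:.1f} mm")
print(focused_at_surface(0.010, 100e-6, 1550e-9))   # 10 mm offset: True
```

A tighter focus (smaller w₀) shrinks the Rayleigh range quadratically, so the tolerance band around the surface narrows quickly as spot size decreases.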
Swept Frequency Laser Radar
[0076] In the following, various configurations and aspects of laser radar systems are disclosed. The disclosed systems, system components, modules, and associated methods can be used in various laser radar systems. In typical examples, so-called swept frequency laser radar systems are provided. Coherent laser radar systems generally use one or more laser diode light sources. The laser diode frequency is directly modulated by modulating a laser diode injection current, modulating laser diode temperature, or in some other way. The laser frequency is generally modulated with a waveform so as to produce a linear frequency sweep or linear “chirp.” The laser frequency ƒ(t) can then be expressed as a function of time t as:

ƒ(t) = ƒ₀ + (Δƒ/Δt)t = ƒ₀ + γt,

wherein ƒ₀ is the laser initial frequency and γ = Δƒ/Δt is the rate of laser frequency change. Linear sweeps are not required, and arbitrary laser frequency variations as a function of time are theoretically useful, such as stepped or other discontinuous frequency variations, or continuous variations based on polynomial or other functions, but linear chirps are generally more convenient and practical. A frequency modulated (FM) measurement beam is focused at a target, and a portion of the beam is scattered, reflected, refracted, or otherwise directed so as to be collected by receiver optics. A local oscillator beam (“LO beam”) is generally obtained as a portion of the same laser beam used to produce the measurement beam. The round trip transit time associated with measurement beam propagation to and from the target results in a frequency difference when the returned portion of the measurement beam (the return beam) and the local oscillator are optically mixed. This frequency difference can be used to determine target distance. The return beam and the LO are directed to a detector such as a PIN photodiode (typically referred to as a square law detector) to produce sum and difference frequency signals. The sum frequency (several hundred THz for a 1.5 μm measurement beam) is beyond available detector bandwidth, but the return and LO beams also produce a difference frequency Δƒ (the heterodyne frequency) within the detector bandwidth. A distance R to a target location can be calculated as R = cΔƒ/(2γ), wherein Δƒ is the heterodyne frequency associated with the return beam, γ is the chirp rate, and c is the speed of light. Heterodyne frequency generation also requires that the LO and return beam not be orthogonally polarized; since range is determined based on frequency differences and not amplitudes, polarization effects reduce the heterodyne signal level but leave the heterodyne frequency unchanged.
[0077] Successful laser radar systems control or measure laser frequency precisely, as the accuracy of range measurements can be limited by the linearity of the laser frequency modulation. For example, if a target is one meter distant, a linearity of one part per thousand is necessary to ensure 1 mm accuracy. Accordingly, laser sources for FM laser radar are configured to provide highly linear chirps, and deviations from linearity are detected and compensated. In some cases, range measurements can have precisions in the few-micron range.
[0078] FM laser radar systems are largely immune to ambient lighting conditions and changes in surface reflectivity because signal detection is based on heterodyne beat frequency, which is independent of signal amplitude and unaffected by stray radiation. Thus, amplitude or intensity variations in the return beam, the measurement beam, or the LO beam tend to have little effect on range measurements. In addition, coherent heterodyne detection can successfully detect optical signals to the shot noise limit so that FM coherent laser radars can make reliable measurements with as little as one picowatt of return beam power, corresponding to a nine order-of-magnitude dynamic range.
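The nine-order dynamic range quoted above can be verified with a one-line computation; the ~1 mW upper power level is an assumed reference used only to illustrate the ratio against the 1 pW detection floor stated in the text.

```python
import math

p_max_w = 1e-3    # assumed upper return-power level, W
p_min_w = 1e-12   # 1 pW detection floor from the text, W
orders = math.log10(p_max_w / p_min_w)
print(f"{orders:.0f} orders of magnitude ({10 * orders:.0f} dB)")
```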
[0079] In some examples below, systems using probe beams at either one or two wavelengths are illustrated. Generally, one or more probe beams can be used, and the use of two counter-chirped beams permits compensation, correction, or elimination of Doppler shift errors associated with relative motion between a laser radar and a target.
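A minimal sketch of the counter-chirp compensation just described: the Doppler shift adds to one beat frequency and subtracts from the other, so averaging the two cancels it. The sign convention and the numbers below are assumptions for illustration.

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_compensated_range(beat_up_hz: float, beat_down_hz: float,
                              chirp_rate_hz_per_s: float) -> float:
    """Average the up- and down-chirp beat frequencies to cancel the common
    Doppler term, leaving only range: R = c * (f_up + f_down) / (4 * gamma)."""
    return C * (beat_up_hz + beat_down_hz) / (4.0 * chirp_rate_hz_per_s)

gamma = 1e14                       # assumed chirp rate, Hz/s
f_range = 2.0 * gamma * 2.0 / C    # range term for an assumed 2 m target
f_doppler = 10e3                   # assumed Doppler shift from target motion, Hz

# Doppler subtracts from the up-chirp beat and adds to the down-chirp beat
# (assumed sign convention); the average recovers the true range.
r = doppler_compensated_range(f_range - f_doppler, f_range + f_doppler, gamma)
print(f"compensated range: {r:.6f} m")
```

With a single chirp direction, the 10 kHz Doppler term in this example would bias the range by several centimeters; the two-beam average removes it entirely.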
[0080] In some examples, the disclosed systems use a single objective lens (with fixed and movable lenses) to direct and focus probe and signal beams to and from a target and to produce an image of the target. This tends to preserve alignment of boresight images and probe beams. Due to differences in probe beam and imaging beam wavelengths and the high numerical aperture (NA) to be used, chromatic aberration correction can be challenging, and imaging through a dichroic prism-type beam splitter can introduce significant amounts of other aberrations such as coma. If an additional lens is used to shape a combined probe/tracer beam prior to focusing by an objective lens, the additional lens can be used to compensate probe beam aberrations (such as chromatic aberration between the visible wavelength of the tracer beam and the infrared wavelength of the probe beam) independently of the imaging beam. In some cases, use of such an additional lens causes excessive tracer beam reflection, and shaping the tracer beam can reduce beam portions, such as reflected portions, that might interfere with target imaging. In some examples, the additional lens provides a beam focus at an optical surface such as a waveplate surface to produce a local oscillator (LO) beam using a cat-eye retroreflector arrangement that provides LO stability.
[0081] In the following, representative examples of the disclosed technology are provided for convenient illustration. Any of the features and aspects of any example can be combined with features and aspects of other examples.
Example 1
[0082] Referring to
[0083] The objective lens 108 also receives portions of the probe and tracer beams returned from the target 116 along with an imaging beam typically based on broadband or ambient illumination of the target 116. The returned portion of the probe beam is directed through the beam splitter 104 to the fiber end 102 so as to propagate in the fiber 101. The imaging beam is coupled by a beam splitter surface 106 to an image sensor 118 along with a portion of the tracer beam. The beam splitter surface 106 is generally a thin film dichroic filter that preferentially transmits the probe beam and reflects the imaging beam (or reflects the probe beam and transmits the imaging beam, as desired). The probe and tracer beams are focused on the target 116 and the imaging beam is focused on the image sensor 118 by adjusting a position of the movable lens 110 along the axis 120. The objective lens 108 thus must operate over a large wavelength range (for example, 250 nm to 1700 nm). However, by using a single lens 108 for probe, tracer, and imaging beams, beam alignment is maintained, and beams are not displaced during beam scanning. As shown in
Example 2
[0084] With reference to
[0085] The focus and probe beams are scanned with an elevational reflector 220 that is secured to a shaft 222 that is retained by a bearing 224 so as to be rotatable about the axis 204 that is parallel to a z-axis of a coordinate system 250. Rotation of the shaft is measured with an encoder 230 that is situated at the shaft 216. Some components are situated in a housing 232.
Example 3
[0086]
[0087] The dichroic reflector 312 is situated to direct an imaging beam received from the target and objective lens to the prism surface 314 so that the imaging beam is reflected, for example by total internal reflection, to the prism surface 320. The angle θ is generally selected to provide total internal reflection at the surface 314, but coatings can be applied to provide suitable reflectivity. Angles θ greater than 45 degrees reduce the angles of incidence of beams at the dichroic reflector 312 so that the dichroic reflector exhibits fewer angle-dependent variations, such as variations in reflectivity as a function of wavelength and/or variations in reflectivity as a function of state of polarization. For example, the angle θ can be greater than 50°, 55°, 60°, 65°, 70°, 75°, or more, reducing beam angles of incidence.
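The total-internal-reflection condition above, and the vertex-angle bound sin⁻¹(1/n) recited in the claims, can be checked directly; the refractive index n = 1.5 is an assumed value typical of optical glass, not one specified in the disclosure.

```python
import math

def critical_angle_deg(n: float) -> float:
    """TIR critical angle sin^-1(1/n) at a glass/air interface, in degrees."""
    return math.degrees(math.asin(1.0 / n))

theta_c = critical_angle_deg(1.5)   # ~41.8 deg for n = 1.5 (assumed)
print(f"critical angle: {theta_c:.1f} deg")

# Prism angles theta of 45 deg and above therefore exceed the critical
# angle, giving total internal reflection at the first surface.
for theta in (45, 50, 55, 60, 65, 70, 75):
    assert theta > theta_c
```

Higher-index glasses lower the critical angle further, widening the margin for the listed prism angles.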
Example 4
[0088] With reference to
[0089] A dichroic filter (beamsplitter) having the characteristics shown in
Example 5
[0090] Referring to
Example 6
[0091] In another example shown in
[0092] In order to acquire quality images at the image sensor 672, chromatic aberration at visible wavelengths should be sufficiently reduced. However, obtaining acceptably low chromatic aberration both across visible wavelengths and between the probe (IR) and pointing (red laser) wavelengths is a demanding lens design challenge.
Example 7
[0093]
[0094] Referring to
Example 8
[0095] Referring to
[0096] Representative filters 850, 860 are illustrated in
Example 9
[0097]
[0098] As shown in
[0099] Another approach to reducing tracer beam portions from reaching an image sensor is illustrated in
Example 10
[0100] With reference to
[0101] The combination of the mixing lens 1014 and the waveplate 1016 serves as a cat-eye retroreflector 1030 that tends to be insensitive to tilts of the waveplate 1016. Portions of the probe beam or beams and the local oscillator beam or beams are directed by the PBS 1012 to a detector assembly 1032 that typically includes respective detectors coupled to receive probe beam portions and LO portions for each probe beam wavelength.
Example 11
[0102]
Example 12
[0103] Referring to
[0104] The fiber end 1218 couples the combined beams to mixing optics 1234. A PBS 1236 receives the combined beams and a mixing lens 1238 forms a beam focus at a surface 1240 of a quarter waveplate 1239 which reflects local oscillator portions back to the PBS 1236. Other portions of the combined beams are directed to a projection optical system 1242 in which the beams propagate through a beam splitter 1244 (shown as a plate, but cube, double-reflecting, or other types can be used) to an objective lens having a fixed lens 1248 and a movable lens 1246 for focusing the combined beams at the target 1250. The beam splitter directs an imaging beam to a camera 1257. One or more scanning mirrors 1249 (typically as illustrated in
[0105] As shown in
[0106]
[0107]
Example 13
[0108]
Example 14
[0109]
[0110] As shown in
[0111] As shown in
Example 15
[0112]
[0113] The example of
Example 16
[0114] Referring to
Example 17
[0115] An alternative optical system 1680 is shown in
Example 18
[0116] As shown in
[0117]
[0118]
Example 19
[0119] Laser radar can include a camera that is aligned along the radar axis. Such implementations can use inexpensive surveillance cameras having calibration parameters that vary with spatial orientation to gravity and with environmental conditions such as temperature. Camera data is processed and presented independently of laser scan data, and real time coordination between the camera and the laser radar data may be difficult or impossible. In some disclosed examples, a metrology camera is situated to use common focusing optics with a laser radar measurement path; such a camera is referred to herein as a confocal camera and the associated laser radar as a confocal Laser Radar (cLR). This provides measurements over 6 degrees of freedom (DOF) between the camera and the laser radar. Such a camera can be a high definition camera coupled to provide camera data at a low level in the system architecture to minimize or reduce latency, allowing real time coordination of the LR and camera data. Using the two measurement modes (LR and camera) in a confocal Laser Radar allows the LR to be pointed to optimally measure a feature of interest. Additionally, a low latency data interface allows real time algorithms and the tracking of features identifiable in the camera image.
[0120] The Laser Radar measures the azimuth, elevation, and range to a surface of interest. Azimuth and Elevation are read from encoders on the appropriate shafts. The range measurement is accomplished with heterodyne interferometry and can be made on almost all surfaces without interference from ambient light. The conversion of Range (R), Azimuth (A) and Elevation (E) into rectilinear coordinates XYZ is accomplished through well-known spherical coordinate to Cartesian coordinate conversions such as:
X_LR = R*cos(E)*cos(A)
Y_LR = R*cos(E)*sin(A)
Z_LR = R*sin(E)
A calibrated camera can be viewed as an angle measurement device in which the azimuth and elevation of every pixel in the picture can be determined. With the LR and the camera having a confocal relationship, the range measurement can provide scale to the camera image. This relationship allows the center pixel of the camera to be directly related to XYZ_LR. While it cannot be guaranteed that the projection of the camera focal plane onto the scene is perpendicular to the central axis of the LR, the actual relationship can be determined through a calibration process. With a calibrated camera, planar features can be measured directly by the camera once the range is determined by the LR. Other features with a known geometry, such as spheres, can also be measured once range is established.
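The conversions above can be collected into a short helper. The off-center pixel function is an illustrative assumption (a small-angle model in which calibration supplies each pixel's angular offsets and the LR range supplies the scale), not the calibration procedure described in the text.

```python
import math

def lr_to_xyz(r, azimuth, elevation):
    """Spherical-to-Cartesian conversion per X_LR = R cos(E) cos(A),
    Y_LR = R cos(E) sin(A), Z_LR = R sin(E); angles in radians."""
    ce = math.cos(elevation)
    return (r * ce * math.cos(azimuth),
            r * ce * math.sin(azimuth),
            r * math.sin(elevation))

def pixel_to_xyz(r, az0, el0, d_az, d_el):
    """XYZ for an off-center pixel: calibrated per-pixel angular
    offsets (d_az, d_el) are added to the LR pointing angles and the
    LR range provides the scale (hypothetical small-angle model)."""
    return lr_to_xyz(r, az0 + d_az, el0 + d_el)
```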
[0121] Referring back to
Example 20
[0122] Referring to
Example 21
[0123] Referring to
[0124] In other examples, fast alignments can be obtained prior to measurement. In many applications, before the system measures the features of interest, an alignment to the part must be performed. The alignment can be of two types: (1) absolute, where the laser radar measures a set of tooling balls that have a known relationship to the part, or (2) an alignment to a set of features. As mentioned above, searching with a camera allows the features to be found quickly. For tooling balls, the confocal laser radar has additional advantages. The camera can be used to center the laser radar on the tooling ball. In all algorithms it is generally presumed that the radius of the tooling ball is known so that a surface measurement of the tooling ball can be projected to the center of the tooling ball. After centering, four different algorithms can be used: (1) for a shiny tooling ball, presume the camera has centered the laser radar correctly and simply measure the range to the surface; (2) for a shiny tooling ball, perform a W-shaped laser radar scan to determine the precise angle to the tooling ball and then measure the range to the surface; (3) for a matte tooling ball, presume the camera has centered the laser radar correctly and simply measure the range to the surface; and (4) for a matte tooling ball, scan the surface and then perform a sphere fit to determine the position of the tooling ball. In all cases the ability to center with the camera improves speed and overall productivity.
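Algorithm (4), the sphere fit to a scanned matte tooling ball, can be sketched with a standard linearized least-squares fit. This is a generic formulation offered for illustration, not necessarily the fit used in the disclosed system, and it assumes the scan points lie on the ball's surface.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit. Linearizes ||p - c||^2 = r^2 into
    ||p||^2 = 2 c.p + (r^2 - ||c||^2) and solves for c and r."""
    p = np.asarray(points, dtype=float)
    A = np.column_stack([2.0 * p, np.ones(len(p))])    # unknowns: cx, cy, cz, k
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = float(np.sqrt(sol[3] + center @ center))  # k = r^2 - ||c||^2
    return center, radius
```

With noiseless points the linear system is exact; with scan noise the solution is the algebraic least-squares estimate of the ball center and radius.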
[0125] The camera can measure features (such as tooling balls) in conjunction with the laser radar range measurement. In addition, the camera can measure planar features such as holes, slots, polygons, etc. For these types of measurements, there is only a void at the center of the feature. Therefore, the laser radar system must intentionally offset the camera field of view to point to the surface around the feature.
Example 22
[0126] Referring to
[0127]
[0128] While an ES is particularly convenient, other tracking targets such as corner cubes mounted in spheres (referred to as “Spherically Mounted Retroreflectors” or “SMRs”) can be used. Such targets should have a corner cube reflection point at a sphere center, otherwise errors may result. The ES does not exhibit misalignment in response to mishandling, unlike SMRs. The various areas of an ES can be provided with paint, be etched, frosted, or coated with a reflective, metallic, dichroic or other coating.
[0129] An ES or other tracking target permits measurement of target areas that have high angles of incidence to a laser radar 2304 or that are hidden. With reference to
[0130] In some examples, an ES is formed by modifying a precision sphere by adding rings of different colors. The rings can also be filled with retroreflective paint, making them highly visible with a flash. The spheres can be made of either matte or shiny material, creating three measurement modes: (1) matte, where the angles are taken from the camera and the range to the center of the sphere comes from the laser radar; (2) a sphere fit on the matte spherical surface; or (3) a W-shaped laser radar scan on the specular point to find the angles, followed by a range measurement to the specular point. In modes 2 and 3 the laser radar makes all the measurements, while the camera centers the laser radar on the sphere and detects that the sphere is not moving. In mode 1 the camera is still used for tracking and for detecting lack of motion, but the angular measurements of the camera are combined with the LR measurements, making the measurement almost instantaneous.
Example 23
[0131] Another type of hidden point tool can also be used with tracking. As shown in
[0132] With two eyeball spheres, two measurements are made, namely the XYZ positions of the two eyeball spheres (XYZ_1, XYZ_2). The distance between XYZ_1 and XYZ_2 is not critical, but the distance D_m between the ES 2512 and the measurement sphere 2403 must be known. Superior measurement results are obtained if the centers of all three spheres are collinear. The center of the measurement sphere 2403 is projected to the surface of the target using normal techniques. A sample calculation of the XYZ of the measurement sphere 2403 is:
Such measurements are practical because low latency allows each sphere to be measured in a few tenths of a second. While a tool having two fixedly separated eyeball spheres and a measurement sphere is convenient, such a tool can instead use a single eyeball sphere that can be moved to differing positions along a shaft 2414. Measurements at each position can then be used.
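A minimal sketch of the collinear extrapolation is given below, under the assumption that the measurement-sphere center lies on the line through the two eyeball-sphere centers, a distance D_m beyond the ES nearer the measurement sphere; the direction convention is hypothetical and would follow the actual tool geometry.

```python
import numpy as np

def measurement_sphere_center(xyz1, xyz2, d_m):
    """Extrapolate along the line through the two eyeball-sphere
    centers (xyz2 assumed nearer the measurement sphere) to the
    measurement-sphere center a distance d_m beyond xyz2."""
    p1 = np.asarray(xyz1, dtype=float)
    p2 = np.asarray(xyz2, dtype=float)
    unit = (p2 - p1) / np.linalg.norm(p2 - p1)  # unit vector from sphere 1 to sphere 2
    return p2 + d_m * unit
```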
Example 24
[0133] Automated measuring systems using laser radar can require expensive, time-intensive setup processes that take weeks to complete and require skilled personnel. Disclosed herein are systems that take advantage of a metrology grade high definition (HD) or other camera embedded in the laser radar. Machine learning algorithms are provided for identifying and/or measuring features such as holes, slots, studs, and other features. So-called "collaborative robots" (typically including mirrors or other optical systems) permit blind spot measurements, and local tracking of the measuring device reduces setup time and speeds up measurement time.
[0134] In some disclosed examples, tooling balls that are placed about the part to be measured are not needed, and augmented reality applications can overlay CAD images of parts on a camera image. This allows automated detection of parts and can be used to direct the laser radar to measure/scan selected target areas. Lengthy laser radar scans are then not needed to locate target areas of interest. For some applications, an additional mirror is used with a laser radar for measurements of hidden or difficult to reach points that are not in a direct line of sight. Such mirrors are generally small, and therefore provide a limited field of view from a fixed position. By attaching such mirrors to a robot, this limited field of view can be greatly extended with automated movement. The use of collaborative robots allows easy positioning of the mirror and the measurement area does not need to be guarded for safety. The collaborative robot therefore can position the mirror in multiple, repeatable, and stable positions allowing for greater field of view than a static mirror position and also allowing for more measurements from a single laser radar position.
[0135] With reference to
[0136] Following this setup, the COBOT 2504 can be driven to each of a plurality of programmed positions and the laser radar 2500 can automatically measure the tooling balls based on the nominal values obtained earlier. This allows for automatic, accurate determination of mirror position for use in sample measurements. In some cases, typically those in which a lower accuracy is sufficient, robot repeatability may be sufficient.
[0137] In order to coordinate mirror measurements and COBOT positioning, digital or physical IO from the COBOT is provided with either a direct connection to the measurement PC or through a programmable logic controller (PLC) based on OPC, Profinet or other standard PLC interfaces. Interface software on the PC can coordinate with movement and in-position signals from the COBOT and measurement signals from the laser radar. This may comprise separate software platforms connecting to each other or may be part of a single software suite to control both the communications to the PLC and the laser radar itself.
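The coordination described above, namely commanding a COBOT move, waiting for the in-position signal, and then triggering a laser radar measurement, might look like the following sketch. The `cobot` and `laser_radar` objects and their method names are hypothetical stand-ins for the PLC or direct-IO interface, not an actual API.

```python
import time

def measure_at_positions(cobot, laser_radar, positions, timeout_s=30.0):
    """Drive the mirror-carrying COBOT through a list of programmed
    positions, waiting for the in-position signal before each laser
    radar measurement (illustrative interface, not a real API)."""
    results = []
    for pos in positions:
        cobot.move_to(pos)                     # command the next mirror position
        deadline = time.monotonic() + timeout_s
        while not cobot.in_position():         # poll the in-position IO signal
            if time.monotonic() > deadline:
                raise TimeoutError(f"COBOT did not reach position {pos}")
            time.sleep(0.05)
        results.append(laser_radar.measure())  # measure only once settled
    return results
```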
[0138] The use of the COBOT 2504 and the mirror 2506 in combination with the attached tooling ball 2528 allows faster measurements, permits measurements with reduced laser radar or part repositioning, and reduces a number of tooling ball measurements required. Multiple mirror positions can be made for a single laser radar position and mirror movements could be made during other measurements of the part, reducing dead measurement time, or simultaneously during repositioning moves. The COBOT 2504 does not necessarily require safety fencing or zoning and therefore can be placed close to the part and even move while operators are nearby. Automatic cleaning of the mirror 2506 can be based on force feedback of the COBOT 2504 through a pad, or from air combs or blowers to prevent deposition of material on the mirror surface.
[0139]
Example 25
[0140] A boresight camera/laser radar system permits acquisition of a target image by stitching together multiple images associated with different portions of the target. Each camera image can be associated with a target distance obtained with the laser radar, and any camera tilt can be compensated using features of known shape as discussed above. In a representative method 2600 shown in
[0141] In some cases, image stitching produces superior results after camera/probe beam calibration. For example, in some examples, a camera field of view center is determined based on one or more images that include image portions corresponding to a location on the target at which a probe beam/tracer beam is incident. In another example, images of a grid pattern can be evaluated to determine image distortion introduced by the projection lens shared by the camera and the probe beam. Such distortions can be corrected or compensated in images to be stitched.
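The grid-pattern distortion evaluation can be sketched with a simple radial polynomial model. The model form and coefficients below are illustrative assumptions; the sign and order of the correction depend on the actual calibration of the projection lens.

```python
import numpy as np

def apply_radial_model(points, center, k1, k2=0.0):
    """Map pixel coordinates through a radial polynomial
    p' = c + (p - c) * (1 + k1*r^2 + k2*r^4), a common low-order
    model fitted from grid-pattern images; negating the fitted
    coefficients gives an approximate correction."""
    p = np.asarray(points, dtype=float) - center
    r2 = (p ** 2).sum(axis=1, keepdims=True)   # squared radius from center
    return center + p * (1.0 + k1 * r2 + k2 * r2 ** 2)
```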
Example 26
[0142] With reference to
Example 27
[0143] In laser radars that include an appropriate imaging system (such as a high definition camera), the laser radar probe beam can be directed to a region of interest based on features selected from target images. In such measurements, tooling balls are not needed. In addition, a target design (such as a CAD image) can be overlaid or otherwise displayed with a camera image for part evaluation. Features to be evaluated can be identified from camera images, and scan paths produced for feature measurement. The laser radar can be driven with these scan paths for part assessment. As shown above, a mirror that is positioned at a collaborative robot can be used for measurements that would otherwise be impossible without repositioning of the laser radar. In some cases, the collaborative robot is controlled based on a selected scan path. A collaborative robot can be situated in spaces that require strict safety measures for human operators, thus simplifying the measurement process. In some cases, a location of a laser radar or other measurement apparatus can be determined using GPS, gyroscopes, and/or accelerometers; in some cases, such tracking can eliminate the need to use tooling balls for alignment.
[0144] By contrast, conventional laser radar requires that four tooling balls be situated on a part for each robot position, and typically 10 or more robot positions are required. Laser radar that can be aligned without tooling balls as disclosed herein can permit rapid, simple part setup and measurement. Using a camera as discussed above, machine learning can be used to detect features and identify those that appear to be in the wrong position, and adjust feature parameters, without reliance on an assumption that the part conforms to a corresponding CAD design.
Example 28
[0145]
[0146] The design system 2810 is configured to create design information corresponding to shape, coordinates, dimensions, or other features of a structure to be manufactured, and to communicate the created design information to the shaping system 2820. In addition, the design system 2810 can communicate design information to the coordinate storage 2831 of the controller 2830 for storage. Design information typically includes information indicating the coordinates of some or all features of a structure to be produced.
[0147] The shaping system 2820 is configured to produce a structure based on the design information provided by the design system 2810. The shaping processes provided by the shaping system 2820 can include casting, forging, cutting, or other process. The shape measurement system 2805 is configured to measure the coordinates of one or more features of the manufactured structure and communicate the information indicating measured coordinates or other information related to structure shape to the controller 2830.
[0148] A manufacture inspector 2832 of the controller 2830 is configured to obtain design information from the coordinate storage 2831, and compare information such as coordinates or other shape information received from the profile measuring apparatus 100 with design information read out from the coordinate storage 2831. The manufacture inspector 2832 is generally provided as a processor and a series of computer-executable instructions that are stored in a tangible computer readable medium such as random access memory, a flash drive, a hard disk, or other physical devices. Based on the comparison of design and actual structure data, the manufacture inspector 2832 can determine whether or not the manufactured structure is shaped in accordance with the design information, generally based on one or more design tolerances that can also be stored in the coordinate storage 2831. In other words, the manufacture inspector 2832 can determine whether or not the manufactured structure is defective or nondefective. When the structure is not shaped in accordance with the design information (and is defective), then the manufacture inspector 2832 determines whether or not the structure is repairable. If repairable, then the manufacture inspector 2832 can identify defective portions of the manufactured structure, and provide suitable coordinates or other repair data. The manufacture inspector 2832 is configured to produce one or more repair instructions or repair data and forward repair instructions and repair data to the repair system 2840. Such repair data can include locations requiring repair, the extent of re-shaping required, or other repair data. The repair system 2840 is configured to process defective portions of the manufactured structure based on the repair data.
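The defective/repairable decision described above can be sketched as a per-feature tolerance check. The Euclidean deviation metric and the two thresholds are illustrative assumptions, not the inspector's actual criteria.

```python
import numpy as np

def inspect_features(measured, nominal, tol, repair_tol):
    """Compare measured feature coordinates against design coordinates.
    Deviations within tol are acceptable, within repair_tol are
    repairable, and anything larger is treated as non-repairable."""
    dev = np.linalg.norm(np.asarray(measured, float) - np.asarray(nominal, float),
                         axis=1)
    status = np.where(dev <= tol, "acceptable",
                      np.where(dev <= repair_tol, "repairable", "non-repairable"))
    return dev, status
```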
Example 29
[0149]
[0150] According to the method of
[0151] In the above embodiment, the structure manufacturing system 2800 can include a profile measuring system such as the laser radars and associated optical systems disclosed herein, the design system 2810, the shaping system 2820, the controller 2830 that is configured to determine whether or not a part is acceptable (inspection apparatus), and the repair system 2840. However, other systems and methods can be used and examples of
Example 30
[0152] Additional embodiments of reference assemblies for laser-based measurement systems such as disclosed above are shown in
[0153] The enclosure 3002 is typically made of copper and temperature controlled with a thermoelectric (TEC) module 3083 and control electronics 3084. The outside of the enclosure 3002 is typically provided with insulation (not shown) that surrounds the enclosure 3002 to insulate the enclosure 3002 from the ambient environment. The enclosure 3002 has a lid that is sealed with bolts and an O-ring. A tube 3082 can be provided for evacuation or filling of the enclosure 3002 with a noble gas or other gas such as nitrogen. The tube 3082 can be situated on an enclosure lid and be made of copper. Such a tube is generally pinched off or otherwise sealed after evacuation or filling of the enclosure 3002. Copper is a convenient material, but other materials can be used. In some cases, the enclosure is filled with a dry gas.
[0154] Referring to
[0155] The second coupler 3055 directs portions of the combined probe beams from the isolators 3054A, 3054B to a third coupler 3060 which divides the combined beam portions into first and second portions that propagate along respective paths 3062A, 3062B. If desired, fiber delay lengths 3045A, 3045B can be situated between the fiber feed-throughs 3043B, 3043C and the isolators 3054A, 3054B so that reflections from internal components produce heterodyne frequencies that are outside the typical measurement range. The paths 3062A, 3062B typically have a stable, fixed path difference provided by including an additional fiber length 3047 in one of these paths. A fourth coupler 3064 receives the first and second beam portions from the paths 3062A, 3062B, combines these portions, and directs the combined portions to respective reference detectors 3050A, 3050B via the fiber feed-throughs 3043D, 3043E. The fixed path difference permits association of a beat signal between the first and second beam portions with a specific length. In most practical examples, optical filters 3080A, 3080B are situated so that the reference detector 3050A receives only beam portions at a first wavelength provided by the first probe laser 3049A and the reference detector 3050B receives only beam portions at a second wavelength provided by the second probe laser 3049B. For example, the first and second wavelengths can be about 1550 nm and 1560 nm. Wavelength demultiplexing couplers can be used instead of the fourth coupler 3064 and the optical filters 3080A, 3080B to separate the wavelengths.
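The association of a beat frequency with the fixed path difference can be sketched for a linearly chirped (FMCW) source. The chirp rate and fiber group index below are illustrative assumptions, not values taken from this disclosure.

```python
def reference_beat_frequency(chirp_rate_hz_per_s, path_diff_m, n_group=1.468):
    """Beat frequency between the two reference paths of a linearly
    chirped source: f_beat = chirp_rate * tau, where the differential
    delay tau = n_group * path_diff / c comes from the extra fiber
    length. The group index is an assumed typical value for silica fiber."""
    c = 299_792_458.0  # speed of light in vacuum, m/s
    tau = n_group * path_diff_m / c
    return chirp_rate_hz_per_s * tau
```

For example, under these assumptions a 100 THz/s chirp with a 10 m fiber path difference produces a beat near 4.9 MHz; because the path difference is fixed and stable, that beat frequency calibrates the frequency-to-length scale used for range measurement.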
[0156] In the examples of
Example 31
[0157] Referring to
[0158] Autofocus provided with the focus controller and the translation mechanism 3126 permits the probe beam focus to be maintained as the probe beam scans various target areas. In conventional systems, establishing probe beam focus on the target can be time consuming. Using a confocal image sensor permits rapid focus adjustments using target images produced at the image sensor 3122. Thus, focus can be established and adjusted, and a probe beam can be directed to any selected portion of a field of view using the image sensor 3122 and the focus controller 3124. A non-transitory computer-readable memory or network connection 3130 receives images from the image sensor 3122 for processing to identify features or to stitch images together to provide a panoramic image of the target.
Example 32
[0159]
[0160] As shown in
Example 33
[0161]
[0162] With reference to
[0163] The exemplary PC 3300 further includes one or more storage devices 3330 such as a hard disk drive for reading from and writing to a hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk (such as a CD-ROM or other optical media). Such storage devices can be connected to the system bus 3306 by a hard disk drive interface, a magnetic disk drive interface, and an optical drive interface, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules, and other data for the PC 3300. Other types of computer-readable media which can store data that is accessible by a PC, such as magnetic cassettes, flash memory cards, digital video disks, CDs, DVDs, RAMs, ROMs, and the like, may also be used in the exemplary operating environment.
[0164] A number of program modules may be stored in the storage devices 3330 including an operating system, one or more application programs, other program modules, and program data. A user may enter commands and information into the PC 3300 through one or more input devices 3340 such as a keyboard and a pointing device such as a mouse. Other input devices may include a digital camera, microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the one or more processing units 3302 through a serial port interface that is coupled to the system bus 3306, but may be connected by other interfaces such as a parallel port, game port, or universal serial bus (USB). A monitor 3346 or other type of display device is also connected to the system bus 3306 via an interface, such as a video adapter. Other peripheral output devices, such as speakers and printers (not shown), may be included.
[0165] The PC 3300 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 3360. In some examples, one or more network or communication connections 3350 are included. The remote computer 3360 may be another PC, a server, a router, a network PC, or a peer device or other common network node, and typically includes many or all of the elements described above relative to the PC 3300, although only a memory storage device 3362 has been illustrated in
[0166] When used in a LAN networking environment, the PC 3300 is connected to the LAN through a network interface. When used in a WAN networking environment, the PC 3300 typically includes a modem or other means for establishing communications over the WAN, such as the Internet. In a networked environment, program modules depicted relative to the personal computer 3300, or portions thereof, may be stored in the remote memory storage device or other locations on the LAN or WAN. The network connections shown are exemplary, and other means of establishing a communications link between the computers may be used.
[0167] In view of the many possible embodiments to which the principles of the disclosed technology may be applied, it should be recognized that the illustrated embodiments are only preferred examples and should not be taken as limiting the scope of the disclosure.
Innovative Features
[0168] Innovative features described herein include, but are not limited to, the following.
Feature A1: An apparatus, comprising: a beam splitter situated to receive a probe beam propagating along an axis; an objective lens situated on the axis to receive a probe beam from the beam splitter and direct the probe beam to a target along the axis, the objective lens including at least one optical element; and an image sensor optically coupled to the beam splitter and situated on the axis to receive an imaging beam from the target via the beam splitter, wherein the at least one optical element is movable to form a target image at the image sensor and focus the probe beam at the target.
Feature A2: The apparatus of A1, further comprising: an optical fiber situated to direct the probe beam and a tracer beam to the beam splitter along the axis, wherein the beam splitter is a dichroic beam splitter, and the objective lens comprises a fixed lens and the at least one movable optical element.
Feature A3: The apparatus of A1, further comprising an autofocus mechanism coupled to the at least one movable optical element to focus the target image at the image sensor and the probe beam at the target.
Feature A4: The apparatus of A1, wherein the probe beam is focused by the at least one movable optical element proximate a center of a field of view of the image sensor.
Feature A5: The apparatus of A1, further comprising a focus controller coupled to the movable optical element and configured to adjust the focus of the probe beam and the imaging beam.
Feature A6: The apparatus of A5, further comprising a beam scanner situated to direct the probe beam to the target and the imaging beam to the image sensor.
Feature A7: The apparatus of A6, further comprising an image processor coupled to the image sensor and the beam scanner so that a selected portion of the target is imaged at the image sensor in a predetermined location of a sensor surface of the image sensor.
Feature A8: The apparatus of A7, wherein the predetermined location is a central location of the sensor surface of the image sensor.
Feature A9: The apparatus of A6, further comprising an image processor coupled to the image sensor and the beam scanner so that the probe beam is directed to a selected portion of the target based on a target image produced by the image sensor.
Feature A10: The apparatus of A1, further comprising a probe beam lens situated to direct the probe beam to the objective lens, the probe beam lens situated so that the probe beam and the imaging beam are focused at the target and the image sensor, respectively.
Feature A11: The apparatus of A1, further comprising an optical fiber having a fiber end situated to direct the probe beam to the beam splitter, wherein the image sensor and the optical fiber end are optically conjugate at one or more wavelengths associated with the imaging beam.
Feature B1: A method, comprising: directing a laser radar probe beam along an axis to an objective lens situated to focus the laser radar probe beam at a target; and directing an imaging beam from the target along the axis to the objective lens and to an image sensor, wherein the image sensor is situated so that the objective lens produces an image of at least a portion of the target at the image sensor.
Feature B2: The method of B1, further comprising focusing the laser radar probe beam at the target based on a contrast of an image of the target formed at the image sensor.
Feature B3: The method of B1, further comprising adjusting the focus of the probe beam at the target based on the image of the target formed at the image sensor.
Feature B4: The method of B1, further comprising estimating a range to the target based on the returned portion of the probe beam.
Feature B5: The method of B1, further comprising estimating a dimension of a target feature based on the estimated range to the target and the image of the target.
Feature B6: The method of B1, further comprising, with an image processor, finding and centering a designated portion of the target in the image, thereby directing the probe beam to the designated portion of the target.
Feature B7: The method of B1, further comprising measuring, with a processor coupled to the image sensor, at least one target dimension using an angle determined based on a target distance.
Feature B8: The method of B1, further comprising estimating at least one target dimension based on a target feature size in the target image and a distance to the target.
Feature B9: The method of B1, further comprising tracking a moving target based on the image of the at least a portion of the target.
Feature B10: The method of B1, further comprising forming a segmented image of the target from a plurality of images of the target obtained with the image sensor and associated distances to the target determined based on returned portions of the probe beam.
Feature B11: The method of B10, further comprising producing a 3D model of the target based on the segmented image.
Feature B12: The method of B11, further comprising locating at least one selected target area in the target based on the panoramic image.
Feature B13: The method of B1 in combination with one or more or all of the features of any of B2-B12.
Feature C1: An apparatus, comprising: a dichroic beam splitter; an optical fiber situated to direct a probe beam and a tracer beam to the dichroic beam splitter along an axis; an objective lens situated on the axis and comprising a fixed lens and a movable lens, the movable lens situated to receive the probe beam from the dichroic beam splitter and direct the probe beam to a target along the axis; and an image sensor optically coupled to the dichroic beam splitter and situated on the axis to receive an imaging beam from the target via the dichroic beam splitter, wherein the movable lens is translatable to form a target image at the image sensor and focus the probe beam at the target.
Feature C2: The apparatus of C1, wherein the dichroic beam splitter is situated so that the probe beam is transmitted through the dichroic beam splitter to the movable lens and the imaging beam is reflected by the dichroic beam splitter to the image sensor.
Feature C3: The apparatus of any of C1-C2, wherein the objective lens is situated to receive a tracer beam from the dichroic beam splitter and direct the probe beam and the tracer beam to the target, wherein the probe beam has a wavelength between 1200 nm and 1800 nm and the tracer beam has a wavelength between 400 nm and 700 nm.
Feature C4: The apparatus of any of C1-C3, wherein the dichroic beam splitter is situated so that the probe beam is reflected by the dichroic beam splitter to the movable lens and the imaging beam is transmitted by the dichroic beam splitter to the image sensor.
Feature C5: The apparatus of any of C1-C4, wherein the dichroic beam splitter is a cube dichroic beam splitter, a plate dichroic beam splitter, or a double-reflecting dichroic beam splitter.
Feature C6: The apparatus of any of C1-C5, wherein the dichroic beam splitter is a double-reflecting dichroic beam splitter that includes a first surface facing the movable lens and a dichroic reflecting surface situated to direct the imaging beam to the image sensor and the portion of the probe beam returned from the target to the optical fiber.
Feature C7: The apparatus of any of C1-C6, wherein the dichroic beam splitter is a double-reflecting dichroic beam splitter that includes a first surface facing the movable lens and a dichroic reflecting surface situated to direct the imaging beam to the first surface so that the imaging beam is reflected to the image sensor by the first surface, and the portion of the probe beam returned from the target to the optical fiber is transmitted by the reflecting surface to the optical fiber.
Feature C8: The apparatus of any of C1-C7, wherein the dichroic beam splitter is a double-reflecting dichroic beam splitter that includes a first surface facing the movable lens and a dichroic reflecting surface situated to direct the portion of the probe beam returned from the target to the first surface, and the imaging beam is transmitted by the dichroic reflecting surface to the image sensor.
C9 The apparatus of any of C1-C8, wherein the first surface is situated at an angle greater than a critical angle with respect to the imaging beam received from the dichroic reflecting surface.
C10 The apparatus of any of C1-C9, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the portion of the probe beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
C11 The apparatus of any of C1-C10, wherein the double-reflecting dichroic beam splitter includes an output surface situated such that the imaging beam returned from the target and reflected by the dichroic reflecting surface to the first surface is reflected to be normally incident to the output surface.
C12 The apparatus of any of C1-C11, wherein the double-reflecting dichroic beam splitter includes a first prism having a vertex angle between the first surface and the dichroic reflecting surface, wherein the vertex angle is greater than sin⁻¹(1/n), wherein n is a refractive index of the first prism.
C13 The apparatus of any of C1-C12, wherein the dichroic reflecting surface of the double-reflecting dichroic beam splitter is defined on a surface of the first prism.
C14 The apparatus of any of C1-C13, wherein the double-reflecting dichroic beam splitter includes a first prism and a second prism secured to each other at respective mating surfaces, and the dichroic reflecting surface is situated at the mating surfaces.
C15 The apparatus of any of C1-C14, wherein the dichroic reflecting surface is defined on at least one of the mating surfaces.
C16 The apparatus of any of C1-C15, wherein the dichroic beam splitter includes a dichroic plate and a plane reflector, wherein the dichroic plate is situated to direct the portion of the probe beam returned from the target to the plane reflector and transmit the imaging beam to the image sensor.
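For reference, the vertex-angle condition in C12 is the standard total-internal-reflection criterion: a ray striking the first surface at more than the critical angle sin⁻¹(1/n) is totally reflected inside the prism. A minimal numerical sketch of that condition (the refractive index and vertex angle below are illustrative values, not taken from the claims):

```python
import math

def critical_angle_deg(n: float) -> float:
    """Critical angle for total internal reflection at a glass-air
    interface, in degrees: theta_c = asin(1/n)."""
    return math.degrees(math.asin(1.0 / n))

def vertex_angle_supports_tir(vertex_angle_deg: float, n: float) -> bool:
    """C12's condition: the prism vertex angle must exceed sin^-1(1/n)
    so the first surface totally reflects the internal beam."""
    return vertex_angle_deg > critical_angle_deg(n)

# Illustrative: a BK7-like glass near 1550 nm has n ~ 1.50.
print(round(critical_angle_deg(1.50), 1))     # ~41.8 degrees
print(vertex_angle_supports_tir(45.0, 1.50))  # True: 45 > 41.8
```

With an assumed index of 1.50, any vertex angle above roughly 41.8 degrees satisfies the claimed inequality, which is why 45-degree prism geometries are a natural fit.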
C17 The apparatus of any of C1-C16, wherein the dichroic beam splitter includes a dichroic plate and a plane reflector, wherein the dichroic plate is situated to reflect the imaging beam to the plane reflector and transmit the portion of the probe beam returned from the target.
C18 The apparatus of any of C1-C17, wherein the optical fiber is a polarization retaining single mode (PRSM) optical fiber and further comprising a polarizing beam splitter (PBS) situated so that the probe beam from the PRSM optical fiber is received by the PBS in a state of polarization that is substantially transmitted by the PBS to the dichroic beam splitter.
C19 The apparatus of any of C1-C18, wherein the state of polarization is a linear state of polarization.
C20 The apparatus of any of C1-C19, further comprising a waveplate situated between the PBS and the dichroic beam splitter to produce a circular state of polarization in the probe beam and to reflect a portion of the probe beam towards the optical fiber to produce a local oscillator beam.
C21 The apparatus of any of C1-C20, wherein the waveplate has an input surface situated to receive the probe beam from the PBS and an output surface situated to receive the probe beam from the input surface of the waveplate, wherein one of the input surface or the output surface is antireflection coated and the other of the input surface and the output surface reflects a portion of the probe beam as the local oscillator beam.
C22 The apparatus of any of C1-C21, further comprising a mixing lens situated to receive the measurement beam from the optical fiber and a dichroic filter situated along the axis on an axial portion of the mixing lens, wherein the dichroic filter is transmissive to the measurement beam and non-transmissive to the tracer beam.
C23 The apparatus of any of C1-C22, wherein the dichroic filter is a dichroic reflector situated along the axis on an axial portion of the mixing lens, the dichroic reflector being transmissive to the measurement beam and reflective to the tracer beam.
C24 The apparatus of any of C1-C23, wherein the dichroic filter is a wavelength-dependent polarizer that is substantially non-transmissive to the tracer beam.
C25 The apparatus of any of C1-C24, further comprising a dichroic reflector situated along the axis on an axial portion of the mixing lens, the dichroic reflector transmissive to the measurement beam and reflective to the tracer beam, wherein a dimension of the dichroic reflector is based on a corresponding dimension of the image sensor.
C26 The apparatus of any of C1-C25, further comprising a mixing lens situated to receive the measurement beam and focus the measurement beam; and a dichroic reflector situated along the axis on an axial portion of the mixing lens, the dichroic reflector transmissive to the measurement beam and reflective to the tracer beam, wherein a dimension of the dichroic reflector is based on a corresponding dimension of the image sensor.
C27 The apparatus of any of C1-C26, wherein a dimension of the dichroic reflector is at least 0.5, 0.75, 1.0, or 1.5 times a product of a corresponding dimension of the image sensor and a ratio of an optical distance along the axis from the mixing lens focus to the dichroic reflector to an optical distance from the mixing lens focus to the image sensor.
C28 The apparatus of any of C1-C27, wherein the dichroic filter is situated on a lens surface of the movable lens.
C29 The apparatus of C1 in combination with one or more or all of the features of any of C2-C28.
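The bound in C27 scales a sensor dimension by the ratio of optical distances measured from the mixing lens focus, in effect projecting the image sensor footprint back onto the reflector plane. A minimal sketch of that bound (all dimensions below are illustrative assumptions, not values from the claims):

```python
def min_reflector_dimension(sensor_dim_mm: float,
                            d_focus_to_reflector_mm: float,
                            d_focus_to_sensor_mm: float,
                            factor: float = 1.0) -> float:
    """Lower bound on a dichroic-reflector dimension per C27:
    factor * sensor_dim * (focus-to-reflector distance /
    focus-to-sensor distance), with factor one of the claimed
    multipliers (0.5, 0.75, 1.0, or 1.5)."""
    return factor * sensor_dim_mm * (d_focus_to_reflector_mm / d_focus_to_sensor_mm)

# Illustrative geometry: a 10 mm sensor, reflector 20 mm from the
# mixing-lens focus, sensor 100 mm from the focus.
print(min_reflector_dimension(10.0, 20.0, 100.0, factor=1.0))  # 2.0 mm
```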
D1 An apparatus, comprising: an optical fiber; an imaging lens situated to receive a measurement beam from the optical fiber and produce a measurement beam focus; an optical element having a surface situated proximate the measurement beam focus to reflect a portion of the measurement beam back into the optical fiber as a local oscillator beam; and an objective lens situated to receive the measurement beam from the optical element, direct a portion of the measurement beam as a probe beam to a target, and direct a portion of the probe beam returned from the target into the optical fiber through the optical element and the imaging lens to form a signal beam.
D2 The apparatus of D1, wherein the optical element is a waveplate.
D3 The apparatus of any of D1-D2, wherein the waveplate has an entrance surface that receives the measurement beam from the imaging lens and an exit surface opposite the entrance surface, wherein the exit surface is situated proximate the measurement beam focus to reflect the portion of the measurement beam.
D4 The apparatus of any of D1-D3, wherein the waveplate has an entrance surface that receives the measurement beam from the imaging lens and an exit surface opposite the entrance surface, wherein the entrance surface is situated proximate the measurement beam focus to reflect the portion of the measurement beam.
D5 The apparatus of any of D1-D4, wherein one of the entrance surface and the exit surface of the waveplate includes an antireflection coating situated to receive the measurement beam from the imaging lens and the other of the entrance surface and the exit surface has an uncoated portion situated to receive the focused measurement beam from the imaging lens.
D6 The apparatus of any of D1-D5, further comprising a polarizing beam splitter situated to couple the measurement beam to the waveplate.
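The uncoated surface recited in D5 (and similarly in C21) produces the local oscillator beam by ordinary Fresnel reflection: at normal incidence an air-glass interface returns R = ((n − 1)/(n + 1))² of the incident power. A small sketch with an assumed refractive index (illustrative, not from the claims):

```python
def fresnel_reflectance(n: float) -> float:
    """Normal-incidence Fresnel reflectance of an uncoated interface
    between air and a medium of index n: R = ((n - 1) / (n + 1))**2."""
    return ((n - 1.0) / (n + 1.0)) ** 2

# Illustrative: a fused-silica-like waveplate surface (n ~ 1.44 near
# 1550 nm) reflects roughly 3% of the focused measurement beam back
# toward the fiber as the local oscillator; the AR-coated surface
# contributes far less.
print(round(fresnel_reflectance(1.44), 4))  # ~0.0325
```

A few percent is typically ample for a heterodyne local oscillator, which is why the claims can rely on a bare uncoated surface rather than a dedicated reference mirror.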
D7 The apparatus of any of D1-D6, further comprising a polarizing beam splitter situated to receive the measurement beam from the imaging lens and couple the measurement beam to the waveplate.
D8 The apparatus of any of D1-D7, wherein the optical element having the surface situated proximate the measurement beam focus is a polarizing beam splitter (PBS).
D9 The apparatus of any of D1-D8, wherein the PBS has an exit surface facing the objective lens and the exit surface of the PBS is situated at the measurement beam focus to reflect the portion of the measurement beam back into the optical fiber as the local oscillator beam.
D10 The apparatus of any of D1-D9, wherein the optical element includes a polarizing beam splitter (PBS) and a waveplate secured to the polarizing beam splitter.
D11 The apparatus of any of D1-D10, wherein the PBS has an entrance surface coupled to receive the measurement beam from the optical fiber and the waveplate includes an exit surface situated to couple the measurement beam from the PBS to the objective lens and to reflect the portion of the measurement beam back into the optical fiber as the local oscillator beam.
D12 The apparatus of any of D1-D11, wherein the PBS is situated to reflect a probe beam portion of the measurement beam to the waveplate.
D13 The apparatus of any of D1-D12, further comprising: an optical detector coupled to the optical fiber and situated to receive the probe beam and the local oscillator beam and produce a heterodyne electrical signal; and a detection system that provides a target distance estimate based on the heterodyne electrical signal.
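D13 leaves the detection scheme open; under a common frequency-modulated continuous-wave (FMCW) reading of a heterodyne laser radar, mixing the returned probe beam with the local oscillator yields a beat note whose frequency is proportional to range. A hedged sketch of that mapping (the sweep parameters are illustrative assumptions, not claim values):

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_freq_hz: float, chirp_slope_hz_per_s: float) -> float:
    """Range from an FMCW heterodyne beat note: the round-trip delay
    2R/c maps to a beat frequency f_b = (2R/c) * slope, so
    R = c * f_b / (2 * slope)."""
    return C * beat_freq_hz / (2.0 * chirp_slope_hz_per_s)

# Illustrative numbers: a 100 GHz optical sweep over 1 ms (slope
# 1e14 Hz/s) with a 2 MHz beat note corresponds to ~3.0 m of range.
print(fmcw_range_m(2e6, 1e14))  # ~3.0 m
```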
D14 The apparatus of any of D1-D13, further comprising: first and second measurement beam sources that produce first and second measurement beams at first and second wavelengths, respectively; and a beam combiner that receives the first and second measurement beams and couples the first and second measurement beams to form a combined measurement beam, wherein the optical fiber directs the combined measurement beam to the imaging lens and the optical element reflects a portion of the combined measurement beam back towards the optical fiber as first and second local oscillator beams.
D15 The apparatus of any of D1-D14, further comprising: first and second optical detectors coupled to the optical fiber or a polarizing beam splitter to receive a portion of the probe beam returned from a target and the first and second local oscillator beams so as to produce first and second heterodyne electrical signals; and a detection system that provides a target distance estimate based on the first and second heterodyne electrical signals.
D16 The apparatus of any of D1-D15, wherein the imaging lens receives a measurement beam and a tracer beam from the optical fiber, and further comprising a dichroic filter situated on an axis of the objective lens, wherein the dichroic filter is non-transmissive to the tracer beam.
D17 The apparatus of D1 in combination with one or more or all of the features of any of D2-D16.
E1 A method, comprising: directing a measurement beam and a tracer beam to a beam splitter, the tracer beam having an associated beam numerical aperture; blocking a portion of the tracer beam numerical aperture so that the beam splitter receives the measurement beam and a partially obscured tracer beam; directing the partially obscured tracer beam from the beam splitter to a target with an objective lens; and receiving an imaging beam with the beam splitter and directing the imaging beam to an imaging detector with the beam splitter, wherein an obscured portion of the tracer beam corresponds to the imaging detector.
F1 A method, comprising: focusing a measurement beam from an optical fiber to a measurement beam focus; and reflecting a portion of the measurement beam towards the optical fiber to produce a local oscillator beam.
F2 The method of F1, wherein the measurement beam is focused through a beam splitter to an optical element having a surface that reflects the portion of the measurement beam to the optical fiber.
F3 The method of any of F1-F2, wherein the optical element is a waveplate and the reflective surface is a surface of the waveplate.
F4 The method of any of F1-F3, wherein the optical element is a polarizing beam splitter (PBS) and the reflective surface is a surface of the PBS.
F5 The method of F1 in combination with one or more or all of the features of any of F2-F4.
G1 An apparatus, comprising: a laser radar situated to direct a probe beam to a target along an axis and produce an estimate of at least one target dimension, the laser radar comprising a probe beam scanner coupled to scan the probe beam axis; and an imager optically situated along the axis to produce an image of the target, wherein the probe beam scanner is coupled to the imager so as to direct the probe beam to a target location based on at least one feature identified in a target image.
G2 The apparatus of G1, wherein the imager is an image sensor, and further comprising an image processor that identifies the at least one feature in the target image.
G3 The apparatus of any of G1-G2, wherein the at least one feature is a design feature, and the target location is associated with the design feature.
G4 The apparatus of any of G1-G3, wherein the at least one feature is a tooling ball or an eyeball sphere, and the target location is determined based on the location of the tooling ball or eyeball sphere.
G5 The apparatus of any of G1-G4, wherein the target location is determined based on the location of the eyeball sphere.
G6 The apparatus of G1 in combination with one or more or all of the features of any of G2-G5.
H1 An apparatus, comprising: a laser radar situated to direct a probe beam to a target along an axis, the laser radar comprising a probe beam scanner coupled to scan the probe beam axis; an imaging system comprising an image sensor optically situated along the axis to produce an image of the target and a focus mechanism coupled to an objective lens to adjust a focus of the target image at the image sensor; and an image processor coupled to the imaging system to produce an estimate of at least one target dimension based on the image of the target and an estimate of a distance to the target.
H2 The apparatus of H1, wherein the laser radar is configured to produce the estimate of the distance to the target.
H3 The apparatus of any of H1-H2, wherein the estimate of the distance to the target is based on an adjustment of the focus mechanism.
H4 The apparatus of any of H1-H3, wherein the focus mechanism is an autofocus mechanism.
H5 The apparatus of any of H1-H4, wherein the image processor identifies at least one feature in the target image.
H6 The apparatus of any of H1-H5, wherein the at least one feature is a design feature, and a target location is associated with the design feature.
H7 The apparatus of any of H1-H6, wherein the at least one feature is a tooling ball or an eyeball sphere, and the target location is determined based on the location of the tooling ball or eyeball sphere.
H8 The apparatus of any of H1-H7, wherein the target location is determined based on the location of the eyeball sphere.
H9 The apparatus of any of H1-H8, wherein the imaging system is configured to produce a plurality of image portions, and the image processor is configured to stitch the plurality of image portions into a common image.
H10 The apparatus of any of H1-H9, wherein the image processor is configured to at least partially compensate distortion in at least one image portion.
H11 The apparatus of any of H1-H10, wherein the image processor is configured to at least partially compensate distortion in at least one image portion based on test grid images.
H12 The apparatus of H1 in combination with one or more or all of the features of any of H2-H11.
I1 A measurement apparatus, comprising: a laser radar that provides a scannable laser probe beam; and a remote mirror system that includes a translatable mirror, wherein the laser radar is configured to direct the scannable laser probe beam to the translatable mirror of the remote mirror system to be reflected to a target to measure at least one feature of the target.
I2 The measurement apparatus of I1, wherein the laser radar is situated to direct the scannable laser probe beam to the remote mirror system and determine a location of the remote mirror system, wherein the at least one feature of the target is measured based on the remote mirror system location and a portion of the probe beam returned from the target to the laser radar.
I3 The measurement apparatus of any of I1-I2, wherein the remote mirror system includes at least one tooling ball or eyeball sphere, and the laser radar is situated to direct the scannable laser probe beam to the at least one tooling ball or eyeball sphere to determine the location of the remote mirror system.
I4 The measurement apparatus of any of I2-I3, wherein the laser radar is coupled to the remote mirror system to initiate adjustment of the translatable mirror so that the scannable laser probe beam is directed to the at least one feature of the target.
I5 The apparatus of I1 in combination with one or more or all of the features of any of I2-I4.
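The claims do not specify the coordinate math behind I1-I4, but folding the probe-beam path through the translatable mirror amounts to a standard mirror-image computation: a target point measured through the mirror corresponds to the reflection of the apparent point across the mirror plane. A minimal sketch (the plane and point coordinates are illustrative assumptions):

```python
def reflect_across_mirror(p, mirror_point, unit_normal):
    """Mirror image of point p across the plane through mirror_point
    with the given unit normal: p' = p - 2 * ((p - m) . n) * n."""
    d = sum((pi - mi) * ni for pi, mi, ni in zip(p, mirror_point, unit_normal))
    return tuple(pi - 2.0 * d * ni for pi, ni in zip(p, unit_normal))

# Illustrative: mirror plane z = 1 with normal +z; an apparent point
# measured at z = 3 "behind" the mirror maps to a real point at z = -1.
print(reflect_across_mirror((0.0, 0.0, 3.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
# → (0.0, 0.0, -1.0)
```

Determining the mirror plane from tooling-ball or eyeball-sphere fiducials, as in I3, supplies the `mirror_point` and `unit_normal` inputs assumed here.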