DUAL LASER SLOPE ANGLE MEASURING DEVICE
20230204353 · 2023-06-29
Inventors
CPC classification
G01S17/48 (PHYSICS)
G01C7/04 (PHYSICS)
G01S7/4802 (PHYSICS)
Abstract
Example embodiments of the described technology provide apparatus and methods for measuring slope angles of surfaces. An example apparatus for measuring a slope angle of a surface may comprise a first laser and a second laser. The first and second lasers may be separated from one another and may be configured to emit parallel light beams towards the surface. The apparatus may also comprise at least one image sensor operable to capture images of the light beams scattered from the surface. The apparatus may also comprise at least one lens positioned to collect the light beams scattered from the surface and focus the scattered light beams onto the at least one image sensor. A controller may be configured to switch one of the first laser and the second laser off to avoid optical interference or cross-talk of the light beams emitted from both the first and second lasers at the at least one image sensor. Additionally, or alternatively, beams emitted by one or both of the lasers may be conditioned to avoid optical interference or cross-talk.
Claims
1. An apparatus for measuring a slope angle of a surface, the apparatus comprising: a first laser and a second laser, the first and second lasers separated from one another and configured to emit parallel light beams towards the surface; an image sensor operable to capture images of the light beams scattered from the surface; at least one lens positioned to collect the light beams scattered from the surface and focus the scattered light beams onto the image sensor; and a controller configured to determine the slope angle of the surface from the images captured by the image sensor.
2. The apparatus of claim 1 wherein the controller is further configured to switch one of the first laser and the second laser OFF or to condition at least one of the light beams emitted from the first and second lasers to avoid optical interference or cross-talk of the light beams emitted from both the first and second lasers at the image sensor.
3. The apparatus of claim 2 comprising a first switch operable to switch delivery of power to the first laser ON and OFF and a second switch operable to switch delivery of power to the second laser ON and OFF, the first and second switches connected to receive a control signal from the controller.
4. The apparatus of claim 3 wherein the first and second switches comprise electrical transistors.
5. The apparatus of claim 4 wherein the first and second switches comprise MOSFET transistors.
6. The apparatus of claim 3 wherein the controller is configured to control the first and second switches such that delivery of power to the first and second lasers is switched ON for less time than it is switched OFF.
7. The apparatus of claim 3 wherein the controller increases brightness of the first or second laser by powering the first or second laser ON for a longer amount of time.
8. The apparatus of claim 3 wherein the controller decreases brightness of the first or second laser by powering the first or second laser ON for a shorter amount of time.
9. The apparatus of claim 2 further comprising a dynamically variable element positioned in an optical path extending between the first and the second lasers and the image sensor, the dynamically variable element controllable by the controller to vary at least one parameter of the light beam emitted by the first laser or the light beam emitted by the second laser to avoid optical interference or cross-talk of the light beams emitted from both the first and second lasers at the image sensor.
10. The apparatus of claim 9 wherein the dynamically variable element comprises an adjustable filter.
11. The apparatus of claim 9 wherein the dynamically variable element comprises a spatial light modulator, an amplitude modulator or a phase modulator.
12. The apparatus of claim 3 wherein the controller dynamically varies the switching times of the first and second switches to match a speed at which the apparatus is travelling relative to the surface.
13. The apparatus of claim 12 wherein the controller dynamically varies the switching times of the first and second switches to match a speed at which a vehicle to which the apparatus is coupled is travelling relative to the surface.
14. The apparatus of claim 1 wherein the controller is configured to perform error detection to verify the accuracy of the readings measured by the image sensor.
15. The apparatus of claim 14 wherein upon detection of an error the controller replaces a faulty reading measured by the image sensor with a value that is an average of two or more previous readings.
16. The apparatus of claim 1 comprising an optical encoder, the optical encoder configured to initiate acquisition of data by the image sensor.
17. The apparatus of claim 1 comprising a line-generating lens positioned in front of each of the first and second lasers.
18. The apparatus of claim 17 wherein the line-generating lens comprises a Powell lens.
19. The apparatus of claim 1 wherein the first and second lasers are rotatable about a vertical axis.
20. The apparatus of claim 19 wherein the first and second lasers are rotatable 45° about the vertical axis, wherein 0° is defined as a line that extends parallel to a transverse axis.
21. The apparatus of claim 1 further comprising an inclination measuring device to measure an angle between the apparatus and a horizontal plane of Earth.
22. An apparatus for measuring a slope angle of a surface, the apparatus comprising: a first laser and a second laser, the first and second lasers separated from one another and configured to emit parallel light beams towards the surface; at least one image sensor operable to capture images of the light beams scattered from the surface; at least one lens positioned to collect the light beams scattered from the surface and focus the scattered light beams onto the at least one image sensor; and a controller configured to: switch one of the first laser and the second laser OFF or to condition at least one of the light beams emitted from the first and second lasers to avoid optical interference or cross-talk of the light beams emitted from both the first and second lasers at the at least one image sensor; and determine the slope angle of the surface from the images captured by the at least one image sensor.
23. The apparatus of claim 22 comprising two image sensors comprising a first image sensor positioned to capture images of light corresponding to the first laser and a second image sensor positioned to capture images of light corresponding to the second laser.
24. A method for measuring slope angle of a surface, the method comprising: providing a slope angle measuring apparatus comprising: a first laser and a second laser, the first and second lasers separated from one another and configured to emit parallel light beams towards the surface; at least one image sensor operable to capture images of the light beams scattered from the surface; and at least one lens positioned to collect the light beams scattered from the surface and focus the scattered light beams onto the at least one image sensor; switching one of the first laser and the second laser OFF or conditioning at least one of the light beams emitted from the first and second lasers to avoid optical interference or cross-talk of the light beams emitted from both the first and second lasers at the at least one image sensor; and determining the slope angle of the surface from the images captured by the at least one image sensor.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0039] The accompanying drawings illustrate non-limiting example embodiments of the invention.
DETAILED DESCRIPTION
[0047] Throughout the following description, specific details are set forth in order to provide a more thorough understanding of the invention. However, the invention may be practiced without these particulars. In other instances, well known elements have not been shown or described in detail to avoid unnecessarily obscuring the invention. Accordingly, the specification and drawings are to be regarded in an illustrative, rather than a restrictive sense.
[0048] One aspect of the technology described herein provides a device configured to measure slope angle (or angles) as a quantity to provide information about a surface. Where the surface is a road, highway or runway, the slope angle information may be acquired along a line that is oriented in the longitudinal direction of travel of a vehicle to which the device (comprising a dual laser slope angle measuring arrangement) may be coupled. While it is known to use sensors that directly measure vertical distance to acquire an elevation profile of a surface, direct measurement of slope angle using closely spaced parallel lasers may have significant advantages over known technologies. The slope angle measurement device described herein provides improved apparatus and methods for acquiring high resolution slope angle profiles (e.g. pavement profiles) while operating at varying velocities and down to and including zero velocity.
[0049] The slope angle measurement device comprises a pair of lasers and may be configured to directly measure slope angle over very small distances, as determined by laser spacing L. This may be achieved by closely integrating and aligning optics that may use a single optical sensor to capture object images formed when beams emitted from both lasers strike a target surface, by alternately switching the lasers to eliminate optical interference and signal crosstalk, and by satisfying the Scheimpflug condition for both lasers so that focus is maintained over a suitably large range of operation. Slope angle may be acquired by measuring front and rear displacements of the lines formed by line lasers, when they contact a target surface, from a zero-position reference typically referred to as the Pavement Reference Line in pavement profile applications.
[0050] The pavement slope angle application for profile measurement may depend on translational motion between the slope angle measuring device and a target surface to be measured. In other applications, the device may be stationary and measure changes of orientation of a substantially planar object or a rotating object that is measured parallel with the rotational axis.
[0051] More generally, a compact, integrated laser slope angle measuring device may enable new applications currently rendered impractical. Such device may be of particular value in both the contract management of new surface construction and as a reference standard for certification of other instruments.
[0052] In one particular embodiment a slope angle measuring apparatus as described herein may be used to profile a pavement surface. The described slope angle measuring apparatus, however, is not limited to use for profiling a pavement surface and may have other applications, such as measuring slopes of surfaces in raw materials processing, manufacturing processes, road and runway construction, road and runway maintenance, road and runway inspection, road and runway quality control testing and surveying, etc.
[0053] Slope angle measurement devices 10A, 10B and 10C may be used to measure a slope angle of a target surface 101. In many respects, slope angle measurement devices 10A, 10B, 10C are similar to one another.
[0054] Referring to
[0055] Two parallel lasers 13, 14 may be mounted to frame 11. Parallel lasers 13, 14 may comprise line-generating lenses 15, 16 fitted to an output of each of lasers 13, 14. Either one (e.g. one of lens 17 or 18) or two lenses 17, 18 may be positioned within an optical path of lasers 13, 14 and line-generating lenses 15, 16. The optical paths of lasers 13, 14, lenses 15, 16 and lenses 17, 18 may be aligned with window-like openings 12 and optical image sensors 19, 20. As described elsewhere herein, in some embodiments, devices 10 comprise a single image sensor (e.g. one of image sensor 19 or image sensor 20) which can detect light corresponding to both lasers 13 and 14. Example device 10A shown in
[0056] Line-generating lenses 15, 16 may be of Powell or other lens type suitable to produce a narrow fan radiation pattern resulting in a straight line on intersection with a smooth level planar target surface. The fan generated by lenses 15, 16 may have a uniform light intensity across the fan angle and resulting line. Lasers 13, 14 may include filters or polarizers as appropriate to improve performance, depending on application.
[0057] In some embodiments, one or more filters or polarizers may condition light emitted from lasers 13, 14. For example, one or more filters may set a wavelength range of the light emitted from lasers 13, 14 (e.g. visible red, green, infrared, etc.). Additionally, or alternatively, one or more filters may block (or at least partially reduce the presence of) unwanted light (e.g. sunlight from outdoor environments, fluorescent light from indoor applications, etc.). In some embodiments, filtering properties of the one or more filters are dynamically varied. For example, an adaptive filter may continue to block unwanted sunlight even as the light spectrum changes (e.g. from dawn to dusk, changing weather, etc.). In some embodiments, filtering properties are varied to improve a signal to noise ratio of the detected light beams from lasers 13, 14. In some embodiments, light emitted from each of lasers 13, 14 is filtered differently to generate two uniform (or identical) beams.
[0058] Lenses 17, 18 may include focusing apparatus, aperture apparatus for control of exposure, filters and/or polarizers as appropriate to improve performance depending on application. Example variations include:
[0059] adjusting exposure for varying environmental conditions;
[0060] increasing power of lasers 13, 14 (or of the light beams emitted by lasers 13, 14) in brighter settings;
[0061] decreasing power of lasers 13, 14 (or of the light beams emitted by lasers 13, 14) in darker settings;
[0062] etc.
In some embodiments, such variations are made in real time. In some embodiments, the same variations are made to both of lasers 13, 14 and/or the light beams emitted by lasers 13, 14. In some embodiments, different variations are made to laser 13 and/or the light beam emitted by laser 13 than laser 14 and/or the light beam emitted by laser 14.
[0063] In some embodiments lenses 17 and 18 are combined into a single lens.
[0064] Optical image sensors 19, 20 may, for example, comprise high speed Complementary Metal Oxide Semiconductor (CMOS) sensors or Charge-Coupled Device (CCD) sensors. In some embodiments, image sensors 19, 20 have a pixel resolution of approximately 4096×2048, with the longer 4096 pixel side of the die oriented to acquire z-axis image information, and the 2048 pixel side of the die oriented to acquire x-axis image information (see axes illustrated in
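The document does not specify how the image displacement of the scattered laser line is extracted from a captured frame; a common approach for line-laser triangulation is a per-column intensity-weighted centroid along the z-axis pixels. A minimal sketch with illustrative pixel values (the column length and intensities are assumptions, not the described device's actual data):

```python
def centroid(column):
    """Sub-pixel position of the laser line within one column of z-axis
    pixels: the intensity-weighted mean of the pixel indices."""
    total = sum(column)
    if total == 0:
        return None   # no scattered light detected in this column
    return sum(i * v for i, v in enumerate(column)) / total

# Eight z-axis pixels; the scattered line straddles pixels 4 and 5 and is
# brighter at pixel 5, so the centroid lands between them:
print(centroid([0, 0, 0, 0, 30, 90, 0, 0]))   # → 4.75
```

Repeating this per column yields the line image whose displacement from the reference position gives s for the triangulation described later.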
[0065] Frame 11 may also support (and optionally enclose parts or all of) a circuit module 21. A schematic depiction of circuit module 21 is shown in
[0066] The external data communication interface may, for example, be a commercially available protocol and interface such as USB2, USB3, Ethernet using an RJ-45 connector and/or the like. The external data communication interface may comprise any suitable wireless communication protocol and interface, such as WiFi, Bluetooth™ and/or the like.
[0067] Power supply and data communications may be connected to slope angle measuring device 10 using an electrical connector or connectors 22.
[0068] In some embodiments, slope angle measuring devices 10 optionally comprise an inclination measuring device (e.g. an inclinometer) 23 to measure the angle α between the frame 11 (and/or the frame of the vehicle (not shown) to which the slope angle measuring device 10 may be coupled) and the horizontal plane of the earth. In some embodiments inclination measuring device 23 is enclosed by frame 11 (e.g. inclination measuring device 23 is located internal to frame 11). In some embodiments inclination measuring device 23 is externally mounted to frame 11 or to a frame of a vehicle to which device 10 is coupled. In some such embodiments, inclination measuring device 23 may be coupled to device 10 with connector(s) 22.
[0069] Inclination measuring device 23 may comprise an inclinometer based on an accelerometer or a gyroscopic device capable of providing the orientation of the frame 11 with respect to the horizontal plane of the earth. In some embodiments (depending on application), inclination measuring device 23 comprises a MicroElectroMechanical System (MEMS) device. In some embodiments, inclination measuring device 23 comprises a MEMS gyroscope and an accelerometer (e.g. a three axis accelerometer). Such embodiments may be particularly advantageous for cases where a vehicle (to which device 10 is coupled) is subject to stop and go traffic. When the vehicle is stopped, inclination measuring device 23 would measure a zero tilt (pitch) using its accelerometers and, when the vehicle is in motion, the gyroscope may be configured to measure a dynamic pitch value by integrating a roll rate.
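The accelerometer/gyroscope fusion described above can be sketched as a simple complementary filter. The blend gain k, sample time and axis conventions below are illustrative assumptions, not values from the described device:

```python
import math

def complementary_pitch(pitch_prev, gyro_rate, accel_y, accel_z, dt, k=0.98):
    """Blend the integrated gyro rate (accurate short-term, while moving)
    with the accelerometer tilt (accurate long-term, while stopped)."""
    gyro_pitch = pitch_prev + gyro_rate * dt      # integrate angular rate
    accel_pitch = math.atan2(accel_y, accel_z)    # static tilt from gravity
    return k * gyro_pitch + (1.0 - k) * accel_pitch

# Stopped vehicle: zero gyro rate and a level accelerometer pull the
# pitch estimate back toward zero tilt.
pitch = 0.1
for _ in range(200):
    pitch = complementary_pitch(pitch, gyro_rate=0.0,
                                accel_y=0.0, accel_z=9.81, dt=0.01)
print(pitch < 0.01)   # → True
```

The gyro term dominates during motion (where accelerations corrupt the gravity reference), while the accelerometer term removes accumulated drift whenever the vehicle is stationary.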
[0070] An optical encoder device 24 may be coupled to an axle of a wheel supporting a vehicle frame or an axle of a wheel coupled to the vehicle frame, to produce pulses corresponding to rotation of the axle or wheel. Encoder 24 may in turn provide these pulses to circuit module 21 which may be configured to interpret the rotational pulses output from encoder 24 into a corresponding longitudinal distance. In some embodiments, optical encoder device 24 is rotationally coupled to the axle of the wheel. The pulses of encoder 24 corresponding to rotation of the wheel may be used to measure a distance travelled, to trigger acquisition of data and/or data processing which may be controlled by processor 21A of printed circuit module 21 and/or the like.
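The pulse-to-distance conversion and distance-based triggering described above might look like the following sketch; the pulses per revolution, wheel circumference and trigger interval are illustrative assumptions, not the described device's actual parameters:

```python
class EncoderOdometer:
    """Convert encoder pulses from a wheel axle into longitudinal distance
    and raise a data-acquisition trigger every fixed number of pulses."""

    def __init__(self, pulses_per_rev=2000, wheel_circumference_m=2.0,
                 pulses_per_trigger=1):
        # Scaling factor: metres of travel represented by one pulse
        self.m_per_pulse = wheel_circumference_m / pulses_per_rev  # 1 mm here
        self.pulses_per_trigger = pulses_per_trigger
        self.pulse_count = 0

    def on_pulses(self, n):
        """Accumulate n pulses; return how many acquisition triggers
        they produce."""
        before = self.pulse_count // self.pulses_per_trigger
        self.pulse_count += n
        after = self.pulse_count // self.pulses_per_trigger
        return after - before

    @property
    def distance_m(self):
        return self.pulse_count * self.m_per_pulse

odo = EncoderOdometer()
print(odo.on_pulses(5))   # → 5 (one trigger per 1 mm of travel here)
print(odo.distance_m)     # 0.005 m travelled
```

Integer pulse counting avoids floating-point drift in the trigger comparison; the distance is derived from the count only when reported.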
[0071] In a particular embodiment of device 10, two lasers 13, 14 are mounted to frame 11, with their optical axes in parallel, both with their optical axes in vertical or z-axis alignment emitting downward in the case of the pavement surface profile measurement application. One of lasers 13, 14 may be mounted in a front position (e.g. laser 13) and the other one of lasers 13, 14 may be mounted in a rear position (e.g. laser 14) of device 10, where a front position may refer to a position having a greater y-axis coordinate and a rear position may refer to a position having a lesser y-axis coordinate, where y is a direction of travel of the vehicle to which device 10 is mounted. L (see e.g.
[0072] As described above, device 10 comprises two lasers (e.g. lasers 13, 14). In some embodiments, the two lasers are a small distance apart (e.g. less than about 0.1 m). Due to the close physical integration of device 10, lenses 17 and/or 18 and image sensors 19 and/or 20 may be able to “see” both object lines corresponding to both lasers 13, 14 simultaneously, potentially resulting in optical interference or crosstalk, or difficulty in resolving which line image observed by the image sensors 19, 20 is associated with which laser.
[0073] In the example embodiment of device 10A shown in
[0074] Such potential optical interference problems may be mitigated by turning on one laser 13, 14 at a time and capturing each line image on its corresponding image sensor 19 and/or 20 separately.
[0076] In some embodiments laser switches 30, 31 comprise transistors. For example, laser switches 30, 31 may comprise high speed MOSFETs. The transistors may, for example, be controlled by outputs from processor 21A or circuit module 21. Laser switches 30, 31 and circuit module 21 may be configured to compensate for any known latencies of lasers 13, 14 and/or laser switches 30, 31.
[0077] In some embodiments, laser 13 emits a beam having different properties than the beam emitted by laser 14 (e.g. different wavelength, polarization and/or the like). One or more filters, polarizers, etc. positioned in the optical path in front of sensor 19 and/or sensor 20 may be selectively configured to block light from one of lasers 13 or 14 at any given time to avoid optical interference.
[0078] In some embodiments, only one of sensors 19 and/or 20 potentially experiences optical interference. In some such embodiments, only one of lasers 13 or 14 is switched OFF and ON accordingly to avoid the potential optical interference.
[0079] In some embodiments, processor 21A and/or circuit module 21 is configured to perform an error detection method (e.g. comparison to past readings, comparison to a threshold value, etc.) to verify the accuracy of the readings measured by sensors 19 and/or 20. If a faulty reading is detected (i.e. an error is detected), the faulty reading may be replaced by a value that is the average of two or more previous readings.
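The error-detection step described above (rejecting a faulty reading and substituting the average of previous readings) can be sketched as follows; the max_jump threshold and the readings are illustrative assumptions:

```python
def validate_reading(reading, history, max_jump=0.05):
    """Error detection by comparison to past readings: if the new reading
    jumps more than max_jump (illustrative threshold, in metres) from the
    previous one, replace it with the average of the two prior readings."""
    if len(history) >= 2 and abs(reading - history[-1]) > max_jump:
        reading = sum(history[-2:]) / 2.0   # average of two prior readings
    history.append(reading)
    return reading

history = [0.010, 0.012]
print(validate_reading(0.011, history))   # plausible reading, kept: 0.011
print(validate_reading(0.900, history))   # spike, replaced by the average
```

A comparison to a fixed threshold value, also mentioned above, would simply replace the jump test with a range check.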
[0080] In some currently preferred embodiments, lasers 13 and 14 are switched such that they are powered ON as little as possible to preserve the life of lasers 13 and 14 and conserve power resources. In some embodiments lasers 13 and 14 are switched ON with a duty cycle that is less than 5%. In some embodiments lasers 13 and 14 are switched ON at least once (e.g. with a duty cycle of less than 5%) every time device 10 moves a threshold distance (e.g. distance L).
[0081] In some embodiments lasers 13 and 14 are powered ON to continuously make measurements. In some embodiments lasers 13 and 14 are switched ON with a duty cycle of 100% or close to 100%.
[0082] In some embodiments, processor 21A dynamically varies the switching times (e.g. varies how long each of lasers 13 and 14 is ON for and how long lasers 13 and 14 are OFF). In some embodiments, the switching times are varied to improve measurement quality (e.g. increase signal to noise ratio by increasing ON time in bright environments and increase power efficiency and/or laser functional lifetime by decreasing ON time in low light environments). In some embodiments, processor 21A reduces a switching frequency, if measurements are remaining constant (e.g. varying by less than about 10%, 5%, 1% and/or the like of previous values). In some embodiments, processor 21A increases a switching frequency if measurements are varying (e.g. varying by more than about 10%, 5%, 1% and/or the like of previous values).
[0083] In some embodiments, processor 21A dynamically varies the switching times to match a speed of a vehicle to which device 10 is coupled (e.g. to maintain a desired distance between measurements).
[0084] In some embodiments, one or both of the beams emitted from lasers 13, 14 are modulated (e.g. amplitude modulated, spatially modulated, phase shifted and/or the like) to differentiate the light beam emitted from laser 13 from the light beam emitted from laser 14. In some embodiments, device 10 comprises a spatial light modulator, a phase modulator, etc. in the optical path of lasers 13 and/or 14.
[0085] Depending on the application, this alternate switching of the lasers and image capturing may need to occur rapidly, for example in less than one millisecond, particularly where the application involves high velocity of relative longitudinal motion between the slope angle measuring device 10 and the target pavement such as in a pavement profile surveying application where the longitudinal direction velocity may be on the order of 75 miles/hour or 33.5 meters/second.
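The timing constraint above can be checked with a short calculation; the 1 mm sample spacing is an illustrative assumption consistent with the trigger intervals discussed later:

```python
def alternation_period_s(speed_m_per_s, sample_spacing_m=0.001):
    """Time available for a full ON/OFF alternation of both lasers while
    the device travels one sample spacing (1 mm assumed here)."""
    return sample_spacing_m / speed_m_per_s

# At 75 miles/hour (about 33.5 m/s), a 1 mm sample spacing leaves roughly
# 30 microseconds per alternation, well under the one-millisecond bound
# mentioned above.
period = alternation_period_s(33.5)
print(f"{period * 1e6:.1f} us")   # → 29.9 us
```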
[0086] With reference to
[0087] Light rays may be traced to apply the principle of optical triangulation to measurements of the front laser (e.g. laser 13) to relate the front change in elevation L.sub.f with the corresponding distance s.sub.f (the length of the line segment IQ) on the front image sensor (e.g. image sensor 19). Likewise, measurements of the rear laser (e.g. laser 14) may be used to relate the rear change in elevation L.sub.r with the corresponding distance s.sub.r on the rear image sensor (e.g. image sensor 20) (not shown in
[0088] One factor to consider is the optical alignment of device 10, that is, the angles, distances, focal length f and magnification M of the optical elements of device 10. These may be constants given that the optical elements are preferably constructed in a fixed configuration within frame 11 as a single rigid unit and move as a unit in the y or z-axis directions. To measure the slope angle of a target surface 101, device 10 may measure object elevation displacements L.sub.f and L.sub.r which are displacements from the Pavement Reference Line 104.
[0089] First considering L.sub.f, a reference point P where a light beam from the front laser (e.g. laser 13) contacts the Pavement Reference Line 104 may be used for geometric alignment of the front optical apparatus, such that a line PCI through the axis of front lens (e.g. lens 17) intersects point P at an angle χ with the optical axis (z-axis) of front laser (e.g. laser 13). The front image sensor (e.g. image sensor 19) may form an angle δ with the line PCI. Lens axis line PCI may intersect the front image sensor (e.g. image sensor 19) at point I. If radiation from the front laser (e.g. laser 13) contacts target surface 101 at point A, then the light ray ACQ may strike the front image sensor (e.g. image sensor 19) at point Q. L.sub.o is the length along the lens axis line PCI from the object point P to the center of the front lens (e.g. lens 17). L.sub.i is the length along the lens axis line PCI from the image point I to point C at the center of the front lens (e.g. lens 17).
[0090] To obtain a formula representation (e.g. L.sub.f=f(s) where f(⋅) is a function) for the optical elements of device 10, distance s (s.sub.f (for front laser), s.sub.r (for rear laser)) may be determined on a surface of the front image sensor (e.g. image sensor 19) that corresponds to a pavement elevation increase seen by the front laser (e.g. laser 13) that causes the spot on the surface of target surface 101 to move from P to A resulting in a change of image location on the image sensor 19 to move from I to Q. In
[0091] Firstly, some similar right-angle triangles may be constructed. Line AB may be drawn at a right angle from lens axis line PCI. Line QD may be drawn at a right angle to an extension of lens axis PCI. This results in similar right triangles ABC and QDC. Given that sides AB and QD are corresponding sides and BC and DC are corresponding sides, their relationship may be represented, for example, as:
AB/QD=BC/DC
where AB=L.sub.f sin χ, QD=s.sub.f sin δ, BC=L.sub.o−L.sub.f cos χ and DC=L.sub.i+s.sub.f cos δ. Substituting the above yields the following representation:
(L.sub.f sin χ)/(s.sub.f sin δ)=(L.sub.o−L.sub.f cos χ)/(L.sub.i+s.sub.f cos δ)
solving by cross multiplication yields the following representation:
L.sub.f L.sub.i sin χ+L.sub.f s.sub.f sin χ cos δ=L.sub.o s.sub.f sin δ−L.sub.f s.sub.f sin δ cos χ
collecting terms to solve for L.sub.f yields the following representation:
L.sub.f(L.sub.i sin χ+s.sub.f(sin χ cos δ+sin δ cos χ))=L.sub.o s.sub.f sin δ
and using the trigonometric sum formula yields the following representation:
L.sub.f=(L.sub.o s.sub.f sin δ)/(L.sub.i sin χ+s.sub.f sin(χ+δ))
[0092] Similarly, L.sub.r may be derived from the rear laser (e.g. laser 14), the rear lens (e.g. lens 18) and the rear image sensor (e.g. image sensor 20). L.sub.o, L.sub.i, χ, and δ typically have the same or similar values as for the front laser (e.g. laser 13). Some embodiments of slope angle measuring device 10, for example device 10A shown in
[0093] Variables on the right side of the equation above (e.g. L.sub.o, L.sub.i, χ, and δ) may be constants (perhaps subject to calibration) except for s.sub.f, since the geometry of device 10 may be fixed and applicable to the full range of measurements. This is demonstrated by the following equation where k.sub.1=L.sub.o sin δ, k.sub.2=L.sub.i sin χ and k.sub.3=sin(χ+δ) are constants.
L.sub.f=(k.sub.1 s.sub.f)/(k.sub.2+k.sub.3 s.sub.f)
Similarly:
L.sub.r=(k.sub.1 s.sub.r)/(k.sub.2+k.sub.3 s.sub.r)
[0094] The rear laser measurement of L.sub.r is not shown.
There may be a non-linear relationship between L.sub.f and s.sub.f.
Solving for the slope m of target surface 101:
m=(L.sub.f−L.sub.r)/L
where Pavement Reference Line 104 is considered to be z=0, L.sub.f and L.sub.r take on positive values when higher than z=0 and negative values when lower than z=0, s.sub.f is an image displacement from Pavement Reference Line 104 of the front laser (e.g. laser 13) on the front image sensor (e.g. image sensor 19) and s.sub.r is an image displacement from Pavement Reference Line 104 of the rear laser (e.g. laser 14) on the rear image sensor (e.g. image sensor 20) or the front image sensor (e.g. image sensor 19) if device 10 comprises only a single image sensor. As discussed elsewhere herein, in some embodiments, s.sub.f and s.sub.r are captured by a single image sensor.
[0095] Substituting for s.sub.f and s.sub.r may yield the following representation:
m=(1/L)((k.sub.1 s.sub.f)/(k.sub.2+k.sub.3 s.sub.f)−(k.sub.1 s.sub.r)/(k.sub.2+k.sub.3 s.sub.r))
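The triangulation and slope relationships derived above can be exercised numerically. The geometry values below (L.sub.o, L.sub.i, χ, δ and the laser spacing L) are illustrative assumptions, not the described device's actual design parameters:

```python
import math

def elevation(s, L_o, L_i, chi, delta):
    """Elevation displacement L_f (or L_r) from image displacement s:
    L = L_o * s * sin(delta) / (L_i * sin(chi) + s * sin(chi + delta))."""
    return (L_o * s * math.sin(delta)) / (
        L_i * math.sin(chi) + s * math.sin(chi + delta))

def slope(s_f, s_r, L, **geom):
    """Slope m of the target surface from front and rear image
    displacements: m = (L_f - L_r) / L, with L the laser spacing."""
    return (elevation(s_f, **geom) - elevation(s_r, **geom)) / L

# Illustrative geometry only:
geom = dict(L_o=0.35, L_i=0.05, chi=math.radians(30), delta=math.radians(60))

# Equal front and rear image displacements give equal elevations, hence a
# level surface:
print(slope(0.001, 0.001, L=0.1, **geom))        # → 0.0
# A larger front displacement raises the front elevation, giving m > 0:
print(slope(0.002, 0.001, L=0.1, **geom) > 0)    # → True
```

The nonlinearity noted above is visible in `elevation`: doubling s does not double L because s also appears in the denominator.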
[0096] If the thin lens equation is satisfied, then the image I of the point P will be in focus. For example:
1/L.sub.o+1/L.sub.i=1/f
[0097] However, as the front laser (e.g. laser 13) light beam contact point A on the pavement ranges to values both higher and lower than P along the z axis, depending on depth of field, the corresponding image point I may not be in focus, particularly since the distance between point A and the center of the lens may vary over a wide range. Accurate measurement of slope angle may depend on sharp focus of the image point Q on the image sensor over the full Measurement Range 100 (e.g. as described elsewhere herein, conveniently ±0.105 m in the direction of the z-axis) around the Pavement Reference Line 104 at 0 m. Therefore, the optical apparatus may be arranged to ensure focus is maintained over the full Measurement Range 100.
[0098] In a simple optical arrangement, the object plane, lens plane and image plane are all parallel. However, for device 10, object planes (front object plane 110 and rear object plane 112) may be rotated with respect to the other planes, since the purpose of device 10 is to measure along an axis that results in changing distance between the object and the lens. Scheimpflug teaches that, if the object plane 110 is rotated relative to the lens plane 108, then the image plane 106 should also be rotated relative to the lens plane 108 to maintain focus over the object plane 110. More specifically, the object plane 110, lens plane 108 and image plane 106 should all intersect in a single line called the Scheimpflug Line, or Scheimpflug's “axis of collineation”. In
[0099] Scheimpflug's condition is set out in U.S. Pat. No. 751,347. From the right triangles SCP and SCI:
r=L.sub.o cot ϵ and r=L.sub.i cot ϕ
where r is the length of the line CS from the center of a front lens (e.g. lens 17) to the Scheimpflug Line represented as point S since it is parallel to the x axis. ϵ is the angle between lens plane 108 bisecting the front lens (e.g. lens 17) and front object plane 110 of the fan formed by the front line-generating lens (e.g. lens 15) from the collimated cylindrical light beam emitted from the front laser (e.g. laser 13). ϕ is the angle between lens plane 108 bisecting the front lens (e.g. lens 17) and image plane 106 of the front image sensor (e.g. image sensor 19).
[0100] The Scheimpflug principle may be satisfied when:
L.sub.o cot ϵ=L.sub.i cot ϕ
[0101] Now restating using angles χ and δ of the non-similar right triangles SCP and SCI (where χ=90°−ϵ and δ=90°−ϕ) yields the following representation:
L.sub.o tan χ=L.sub.i tan δ
[0102] The optical apparatus of device 10 is preferably aligned such that this condition is satisfied for both front and rear lasers, in particular where a single lens or a single image sensor is employed to measure both L.sub.f and L.sub.r.
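The thin lens and Scheimpflug conditions above can be checked together: given a focal length, object distance and laser angle, the image distance follows from the thin lens equation and the image-plane tilt from the Scheimpflug relation. The numeric values below are illustrative assumptions only:

```python
import math

def thin_lens_image_distance(L_o, f):
    """Image distance L_i from the thin lens equation 1/L_o + 1/L_i = 1/f."""
    return 1.0 / (1.0 / f - 1.0 / L_o)

def scheimpflug_image_angle(L_o, L_i, chi):
    """Image-plane angle delta satisfying L_o * tan(chi) = L_i * tan(delta),
    so that object, lens and image planes meet in the Scheimpflug Line."""
    return math.atan(L_o * math.tan(chi) / L_i)

# Illustrative focal length, object distance and laser angle only:
f, L_o, chi = 0.035, 0.35, math.radians(10)
L_i = thin_lens_image_distance(L_o, f)
delta = scheimpflug_image_angle(L_o, L_i, chi)

# Both conditions hold simultaneously for this alignment:
print(abs(1 / L_o + 1 / L_i - 1 / f) < 1e-9)                     # → True
print(abs(L_o * math.tan(chi) - L_i * math.tan(delta)) < 1e-9)   # → True
```

Since L_i is fixed by the thin lens equation once f and L_o are chosen, tilting the image sensor to the computed δ is the remaining degree of freedom that keeps the line image focused over the full measurement range.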
[0103] Magnification M may be given by:
M=−L.sub.i/L.sub.o
[0104] Magnification may be determined for the largest value of L.sub.f given the nonlinear relationship between L.sub.f and s.sub.f (this is preferred in some cases). M may have a negative value because the image s.sub.f may be inverted relative to object L.sub.f.
[0105] The optics of device 10 may be designed by selecting appropriate values of Setback Distance 102 (conveniently, in some embodiments, set to 0.300 m or some other suitable distance) to Pavement Reference Line 104 which is the center of the Measurement Range 100 (conveniently, in some embodiments, 0.200 m). χ, δ, f, L.sub.o, L.sub.i and M are typically largely interrelated, so trade-offs (e.g. optic parameter trade-offs) may be desirable.
[0106] Device 10 may commence acquiring data based on a trigger signal. The trigger signal may be based on intervals of constant time Δt, such as 1 millisecond for example, or of constant distance Δd, such as 1 millimeter for example. A suitable trigger signal, by way of example, may be based on pulses (e.g. a suitable number of pulses) received from an optical encoder device 24 mechanically coupled to the axle of a wheel of a vehicle to whose frame the subject slope angle measuring device 10 is mounted. Rotation of the encoder shaft produces pulses proportional to the rotation of the wheel, and therefore to the longitudinal distance travelled; the distance travelled may be obtained by dividing the accumulated number of pulses by a scaling factor that converts pulse counts to longitudinal distance. In some embodiments the trigger signal also initiates switching of lasers 13, 14 as described elsewhere herein.
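By way of a hedged illustration, the pulse-to-distance conversion and a constant-distance trigger of the kind described in [0106] might be sketched as follows. The scaling factor, interval value, and all names are illustrative assumptions; the patent does not specify an API.

```python
# Sketch only: an assumed encoder scaling factor and a 1 mm
# constant-distance trigger, per the example values in [0106].
PULSES_PER_METRE = 20000.0   # assumed scaling factor (pulses per metre)
TRIGGER_INTERVAL_M = 0.001   # constant-distance trigger interval (1 mm)

def distance_travelled(accumulated_pulses: int) -> float:
    """Convert an accumulated pulse count to longitudinal distance (m)
    by dividing by the scaling factor, as described in [0106]."""
    return accumulated_pulses / PULSES_PER_METRE

class DistanceTrigger:
    """Fires once each time the vehicle advances TRIGGER_INTERVAL_M."""

    def __init__(self) -> None:
        self._last_fire_m = 0.0

    def update(self, accumulated_pulses: int) -> bool:
        d = distance_travelled(accumulated_pulses)
        if d - self._last_fire_m >= TRIGGER_INTERVAL_M:
            self._last_fire_m = d
            return True
        return False
```

A time-based trigger (constant Δt) would follow the same pattern with a clock source in place of the encoder count.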
[0107] When a predetermined or configurable (e.g. user-configurable) trigger time Δt or distance Δd threshold is reached, a processor (e.g. the processor 21A of circuit module 21) may trigger execution of a subroutine comprising the following example steps, which may measure information in respect of target surface 101. Such information may comprise, for example, the slope angle of target surface 101 and, optionally, other characteristics of target surface 101 such as elevation change ΔE.sub.n, total accumulated elevation E.sub.N, and/or the like. In some embodiments, such a measurement subroutine may comprise some or all of the following steps:

[0108] 1. Acquire raw data for both front and rear lasers from measuring device(s) (e.g. a device 10 having one or more image sensors such as image sensor 19 and/or 20) using input hardware interfaces. The data may be acquired substantially simultaneously, for example within one millisecond, so that relative motion between slope angle measuring device 10 and the target pavement surface does not influence the geometry of the measurement and so that the highest-accuracy measurement may be made. This maximum threshold timing between measurements may depend on the y-direction speed at which a vehicle supporting device 10 is moving. An angle α may additionally be acquired or measured from the inclination measuring element (e.g. inclination measuring device 23); for example, angle α may be acquired if required for pavement profile measurement.

[0109] If there is potential optical interference and/or signal crosstalk, or if a single image sensor (e.g. image sensor 19 or 20) is used for measuring both the front and rear lasers (e.g. lasers 13 and 14 respectively), the subroutine may comprise measuring using the front and rear lasers separately by alternately switching ON one laser at a time:

[0110] a. If measuring L.sub.f, switch ON the front laser (e.g. laser 13) and switch OFF the rear laser (e.g. laser 14).

[0111] b. If measuring L.sub.r, switch OFF the front laser (e.g. laser 13) and switch ON the rear laser (e.g. laser 14).

[0112] An image may, for example, be recorded on the front image sensor (e.g. image sensor 19) by performing the steps of:

[0113] a. Setting the front lens (e.g. lens 17) aperture for proper exposure;

[0114] b. Setting focus, if the front lens (e.g. lens 17) is so equipped;

[0115] c. Operating the “shutter” of the front image sensor (e.g. image sensor 19) by enabling pixel electrical charge accumulation.

[0116] L.sub.f and L.sub.r may be acquired in sequence (L.sub.f and then L.sub.r, or vice versa). In some embodiments one or more steps described below are first performed for one of L.sub.f and L.sub.r prior to being performed for the other one of L.sub.f and L.sub.r. In some embodiments L.sub.f is determined prior to L.sub.r being determined. In some embodiments L.sub.r is determined prior to L.sub.f being determined. In some embodiments at least one of the steps described below is performed simultaneously for both L.sub.f and L.sub.r.

[0117] 2. Determine position in time and space by accumulating Δt to determine a current total time T.sub.total (T.sub.total=Σ.sub.i Δt.sub.i, where i is the number of iterations of the measurement subroutine) and accumulating Δd to determine a total longitudinal distance position D.sub.total (D.sub.total=Σ.sub.i Δd.sub.i). Δt and/or Δd may be accumulated or summed together by, for example, processor 21A. In some embodiments Δt and/or Δd may be accumulated in a specific memory location or register (e.g. an accumulator register).

[0118] 3. With reference to
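The alternating-laser acquisition and the time/distance accumulation of steps [0109]-[0117] can be sketched as below. The hardware interface (switch_on, switch_off, capture) is an illustrative assumption; the patent does not disclose a software API.

```python
# Sketch only: alternate the front and rear lasers so that exactly one is
# ON during each capture, avoiding cross-talk at a shared image sensor
# ([0109]-[0111]), then accumulate elapsed time and distance ([0117]).
from dataclasses import dataclass

@dataclass
class Accumulators:
    total_time_s: float = 0.0       # T_total = sum over i of dt_i
    total_distance_m: float = 0.0   # D_total = sum over i of dd_i

def measure_lf_lr(front_laser, rear_laser, sensor):
    """Measure L_f then L_r with one image sensor, one laser ON at a time."""
    front_laser.switch_on()
    rear_laser.switch_off()
    l_f = sensor.capture()          # image of the front laser line only
    front_laser.switch_off()
    rear_laser.switch_on()
    l_r = sensor.capture()          # image of the rear laser line only
    rear_laser.switch_off()
    return l_f, l_r

def on_trigger(acc: Accumulators, dt_s: float, dd_m: float) -> None:
    """Step 2 of the subroutine: accumulate Δt and Δd per iteration."""
    acc.total_time_s += dt_s
    acc.total_distance_m += dd_m
```

Where two image sensors are used and no cross-talk is expected, both captures could instead proceed concurrently, per step [0116].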
[0133] In some embodiments one or both of lasers 13, 14 are rotated by 45° relative to the direction of travel. Such rotation would be around a vertical line in the direction of the z-axis.
Example Application
[0134] In one example case a device 10 is coupled to an underside of a frame of a vehicle (see e.g.
[0135] Movement of the vehicle may cause device 10 to commence measuring the surface profile (e.g. via a trigger signal generated by optical encoder device 24). Additionally, or alternatively, the operator may commence measurement through external computer user interface 40.
[0136] In some embodiments device 10 provides feedback to the operator via external computer user interface 40 (e.g. a status of the measuring cycle, measurement parameters, detected errors, etc.).
Interpretation of Terms
[0137] Unless the context clearly requires otherwise, throughout the description and the claims:

[0138] “comprise”, “comprising”, and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to”;

[0139] “connected”, “coupled”, or any variant thereof, means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof;

[0140] “herein”, “above”, “below”, and words of similar import, when used to describe this specification, shall refer to this specification as a whole, and not to any particular portions of this specification;

[0141] “or”, in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list;

[0142] the singular forms “a”, “an”, and “the” also include the meaning of any appropriate plural forms.
[0143] Words that indicate directions such as “vertical”, “transverse”, “horizontal”, “upward”, “downward”, “forward”, “backward”, “inward”, “outward”, “left”, “right”, “front”, “back”, “top”, “bottom”, “below”, “above”, “under”, and the like, used in this description and any accompanying claims (where present), depend on the specific orientation of the apparatus described and illustrated. The subject matter described herein may assume various alternative orientations. Accordingly, these directional terms are not strictly defined and should not be interpreted narrowly.
[0144] Embodiments of the invention may be implemented using specifically designed hardware, configurable hardware, programmable data processors configured by the provision of software (which may optionally comprise “firmware”) capable of executing on the data processors, special purpose computers or data processors that are specifically programmed, configured, or constructed to perform one or more steps in a method as explained in detail herein and/or combinations of two or more of these. Examples of specifically designed hardware are: logic circuits, application-specific integrated circuits (“ASICs”), large scale integrated circuits (“LSIs”), very large scale integrated circuits (“VLSIs”), and the like. Examples of configurable hardware are: one or more programmable logic devices such as programmable array logic (“PALs”), programmable logic arrays (“PLAs”), and field programmable gate arrays (“FPGAs”). Examples of programmable data processors are: microprocessors, digital signal processors (“DSPs”), embedded processors, graphics processors, math co-processors, general purpose computers, server computers, cloud computers, mainframe computers, computer workstations, and the like. For example, one or more data processors in a control circuit for a device may implement methods as described herein by executing software instructions in a program memory accessible to the processors.
[0145] Processing may be centralized or distributed. Where processing is distributed, information including software and/or data may be kept centrally or distributed. Such information may be exchanged between different functional units by way of a communications network, such as a Local Area Network (LAN), Wide Area Network (WAN), or the Internet, wired or wireless data links, electromagnetic signals, or other data communication channel.
[0146] For example, while processes or blocks are presented in a given order, alternative examples may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times.
[0147] In addition, while elements are at times shown as being performed sequentially, they may instead be performed simultaneously or in different sequences. It is therefore intended that the following claims are interpreted to include all such variations as are within their intended scope.
[0148] In some embodiments, the invention may be at least partially implemented in software. For greater clarity, “software” includes any instructions executed on a processor, and may include (but is not limited to) firmware, resident software, microcode, and the like. Both processing hardware and software may be centralized or distributed (or a combination thereof), in whole or in part, as known to those skilled in the art. For example, software and other modules may be accessible via local memory, via a network, via a browser or other application in a distributed computing context, or via other means suitable for the purposes described above.
[0149] Where a component (e.g. a software module, processor, assembly, device, circuit, etc.) is referred to above, unless otherwise indicated, reference to that component (including a reference to a “means”) should be interpreted as including as equivalents of that component any component which performs the function of the described component (i.e., that is functionally equivalent), including components which are not structurally equivalent to the disclosed structure which performs the function in the illustrated exemplary embodiments of the invention.
[0150] Specific examples of systems, methods and apparatus have been described herein for purposes of illustration. These are only examples. The technology provided herein can be applied to systems other than the example systems described above. Many alterations, modifications, additions, omissions, and permutations are possible within the practice of this invention. This invention includes variations on described embodiments that would be apparent to the skilled addressee, including variations obtained by: replacing features, elements and/or acts with equivalent features, elements and/or acts; mixing and matching of features, elements and/or acts from different embodiments; combining features, elements and/or acts from embodiments as described herein with features, elements and/or acts of other technology; and/or omitting features, elements and/or acts from described embodiments.
[0151] Various features are described herein as being present in “some embodiments”. Such features are not mandatory and may not be present in all embodiments. Embodiments of the invention may include zero, any one or any combination of two or more of such features. This is limited only to the extent that certain ones of such features are incompatible with other ones of such features in the sense that it would be impossible for a person of ordinary skill in the art to construct a practical embodiment that combines such incompatible features. Consequently, the description that “some embodiments” possess feature A and “some embodiments” possess feature B should be interpreted as an express indication that the inventors also contemplate embodiments which combine features A and B (unless the description states otherwise or features A and B are fundamentally incompatible).
[0152] It is therefore intended that the following appended claims and claims hereafter introduced are interpreted to include all such modifications, permutations, additions, omissions, and sub-combinations as may reasonably be inferred. The scope of the claims should not be limited by the preferred embodiments set forth in the examples, but should be given the broadest interpretation consistent with the description as a whole.