MACHINE LEARNING BASED ROADWAY STRIPING APPARATUS AND METHOD
20220042258 · 2022-02-10
Assignee
Inventors
- Douglas D. Dolinar (Souderton, PA, US)
- William R. Haller (Souderton, PA, US)
- Kyle J. Leonard (Souderton, PA, US)
- Eric M. Stahl (Souderton, PA, US)
- Charles R. Drazba (Souderton, PA, US)
- Matthew W. Smith (Souderton, PA, US)
CPC classification
E01C23/163
FIXED CONSTRUCTIONS
G06V20/588
PHYSICS
E01C23/222
FIXED CONSTRUCTIONS
International classification
E01C23/16
FIXED CONSTRUCTIONS
Abstract
A system for striping a roadway marking onto a roadway surface, either rehabilitating the current roadway marking or duplicating a previously recorded roadway marking, with a computer having a machine learning program to process the roadway marking image and position the marker at the desired roadway marking location.
Claims
1. A roadway striping apparatus for replicating, on a resurfaced roadway surface, a pre-existing roadway mark located on the roadway surface prior to resurfacing, the apparatus comprising: a vehicle; an imager affixed to the vehicle configured to produce an image of the prior roadway surface including at least one pre-existing roadway mark located on the prior roadway surface; a GPS receiver affixed to the vehicle; a marker affixed to the vehicle and responsive to a dispensing signal for dispensing roadway marking material; a sensor responsive to the marker for determining the location of the marker; a computer responsive to the imager, GPS receiver and sensor having (a) a machine learning program for processing the image, (b) a program for determining the type of pre-existing roadway mark from the processed image, (c) a program for determining a best-fit roadway mark path from the GPS location of the processed image, (d) a program for producing an error signal based upon the location difference between the best-fit roadway mark path and the sensor, and (e) a program for producing a dispensing signal for replicating the processed pre-existing roadway mark along the best-fit roadway mark path, wherein, the computer is configured to: (a) recognize the pre-existing roadway mark within the image, (b) produce a dispensing signal for dispensing roadway marking material over the best-fit roadway mark path, and (c) produce an error signal based upon the lateral location difference between the best-fit roadway mark path and the marker position; and an actuator attached to the marker and responsive to the error signal and configured to position the marker over the best-fit roadway mark path.
2. The apparatus according to claim 1 wherein the actuator comprises a laterally moveable carriage.
3. The apparatus according to claim 2 wherein the imager is affixed to the carriage.
4. The apparatus according to claim 1, wherein the sensor is configured to process an image of an electromagnetic radiation source attached to the marker.
5. The apparatus according to claim 4 wherein the electromagnetic radiation source comprises either a laser or a laser line generator.
6. The apparatus according to claim 1, further comprising a deterministic timing source in communication with the computer, wherein the deterministic timing source synchronously or asynchronously time stamps images of the roadway surface.
7. The apparatus according to claim 1, wherein the machine learning network is a supervised machine learning network system configured to process illumination conditions of the roadway surface consisting of: shadows, color changes of the roadway surface, intersections, imager field of view variations, blending of the roadway mark into the roadway surface, and background clutter and noise.
8. The apparatus according to claim 7, wherein the supervised machine learning network system comprises a convolutional neural network.
9. An apparatus for placing layout indicia onto a resurfaced roadway surface, the apparatus comprising: a vehicle; an imager affixed to the vehicle configured to produce an image of a roadway surface prior to being resurfaced, the image including at least one pre-existing roadway mark located on the roadway surface prior to being resurfaced; a GPS receiver affixed to the vehicle; a marker affixed to the vehicle and responsive to a dispensing signal for dispensing layout indicia; a sensor responsive to the marker for determining the location of the marker; a computer responsive to the imager, GPS receiver and sensor having (a) a machine learning program for processing the image, (b) a program for determining the type of the pre-existing roadway mark from the processed image, (c) a program for determining a best-fit roadway mark path from the GPS location of the processed image, (d) a program for producing an error signal based upon the location difference between the best-fit roadway mark path and the sensor, and (e) a program for producing a dispensing signal for placing layout indicia onto the best-fit roadway mark path, wherein, the computer is configured to: (a) recognize the pre-existing roadway mark within the image, (b) produce a dispensing signal for dispensing layout indicia over the best-fit roadway mark path, and (c) produce an error signal based upon the lateral location difference between the best-fit roadway mark path and the marker position; and an actuator attached to the marker and responsive to the error signal and configured to position the marker over the best-fit roadway mark path.
10. The apparatus according to claim 9 wherein the actuator comprises a laterally moveable carriage.
11. The apparatus according to claim 10 wherein the imager is affixed to the carriage.
12. The apparatus according to claim 9, wherein the sensor is configured to process an image of an electromagnetic radiation source attached to the marker.
13. The apparatus according to claim 12 wherein the electromagnetic radiation source comprises either a laser or a laser line generator.
14. The apparatus according to claim 9, further comprising a deterministic timing source in communication with the computer, wherein the deterministic timing source synchronously or asynchronously time stamps images of the roadway surface.
15. The apparatus according to claim 9, wherein the machine learning network is a supervised machine learning network system configured to process illumination conditions of the roadway surface consisting of: shadows, color changes of the roadway surface, intersections, imager field of view variations, blending of the roadway mark into the roadway surface, and background clutter and noise.
16. The apparatus according to claim 15, wherein the supervised machine learning network system comprises a convolutional neural network.
17. A method for dispensing material from a marker onto a resurfaced roadway surface along a best-fit roadway mark, the method comprising: producing an image of a roadway surface prior to being resurfaced, the image including at least one pre-existing roadway mark located on the roadway surface prior to being resurfaced and a GPS location of the processed image; determining from the image with a machine learning program: (a) a type of the pre-existing roadway mark from the processed image and (b) a best-fit roadway mark path from the GPS location of the processed image; determining the marker position; producing with a machine learning program: (a) a dispensing signal for dispensing material from the marker over the best-fit roadway mark path and (b) an error signal based upon the lateral location difference between the best-fit roadway mark path and the marker position; positioning the marker over the best-fit roadway mark path using an actuator responsive to the error signal; and dispensing the material.
18. The method according to claim 17 wherein the material is selected from the group consisting of roadway marking material and layout indicia.
19. The method according to claim 17, further comprising filtering and compressing the image of the roadway surface.
20. The method according to claim 17, wherein the type of the pre-existing roadway mark is determined by comparing the image to a set of roadway mark images previously uploaded to a database by an owner, employee, licensor, or agent of the entity depositing the material on the roadway surface.
Description
BRIEF DESCRIPTION OF THE DRAWING
[0127] The invention is best understood from the following detailed description when read in connection with the accompanying drawing. It is emphasized that, according to common practice, the various features of the drawing are not to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity. Included in the drawing are the following figures:
DETAILED DESCRIPTION OF THE INVENTION
[0158] The present invention provides machine vision and machine learning based roadway marking systems, operated from a moving vehicle, for repainting or otherwise rehabilitating existing roadway traffic lane demarcation lines on roadway surfaces, a process commonly referred to as maintenance-based restriping, and for repainting or otherwise duplicating original roadway traffic lane demarcation lines, previously recorded, onto a newly repaved roadway surface, a process commonly referred to as layout-based restriping.
[0159] Referring now to the drawing, in which like reference numbers refer to like elements throughout the various figures that comprise the drawing,
[0160] Center skip line 12 usually follows the longitudinal directed center of the roadway 1. A roadway mark path 16 defines the path which center skip line 12 follows, and the longitudinal center line of center skip line 12 is coincident with roadway mark path 16. Mark path 16 is shown as a dashed line on roadway surface 4, and edge lines 10 and 14 are usually offset a given distance in the lateral direction from roadway mark path 16 and are therefore substantially parallel to center skip line 12. It is understood that roadway mark path 16 is not visible on the roadway surface 4 but only illustrates and indicates the longitudinal center line of center skip line 12. Other roadway marks may be offset from roadway mark path 16.
[0161] Usually roadway lane edge lines 10 and 14 are continuous lines but may have breaks or other segments which are not marked. Roadway traffic exit lanes are good examples of where the edge lines 10 and 14 may have breaks or may otherwise not be parallel with mark path 16. Likewise, center skip line 12 could be a single solid line, a double solid line, or a combination of these or other lines.
[0162] Center skip line 12 comprises a cyclic pattern of roadway line mark segment 18 followed by an unmarked gap segment 20. This cycle of mark and gap segments is repeated continuously on roadway surface 4 along roadway mark path 16 but may change depending upon the roadway mark specifications. For example, the center skip line pattern may change to a solid single or double line, or even to a roadway mark comprising one solid line parallel to a skip line, such as a conventional roadway passing mark. The invention is not limited to the particular type of center or edge line patterns and includes solid single- and double-line patterns, single and double skip-line patterns, other patterns, or various combinations of line patterns.
[0163] Center skip line 12 has cyclic length 22 with mark segment 18 having length 24 and gap segment 20 having length 26. Skip line patterns may be noted as two numbers separated by a delimiter, the first number indicating the mark segment length 24 followed by the second number which indicates cyclic length 22. For example, a 15/40 (the delimiter is the “/”) skip line pattern defines mark segment 18 length 24 of 15 feet (450 cm) and cyclic length 22 of 40 feet (1,200 cm), yielding a computed gap segment 20 length 26 of 25 feet (750 cm). Many other skip line patterns exist and may include 10/40, etc. Also, skip line patterns may be expressed in metric units (meters).
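Under the notation just described, the gap segment length follows directly from the pattern's two numbers; the arithmetic can be sketched as a small helper (the function name and the parsing of the "/" delimiter are illustrative, not part of the invention):

```python
def skip_line_lengths(pattern: str):
    """Parse a skip line pattern such as "15/40" (mark segment length /
    cyclic length) and return (mark, cycle, gap) in the pattern's units."""
    mark, cycle = (float(n) for n in pattern.split("/"))
    gap = cycle - mark  # the unmarked gap fills the remainder of the cycle
    return mark, cycle, gap

# A 15/40 pattern: 15 ft mark segment, 40 ft cycle, hence a 25 ft gap
print(skip_line_lengths("15/40"))  # (15.0, 40.0, 25.0)
```

The same helper handles the other patterns mentioned, such as 10/40, whether the units are feet or meters.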
[0164] A conventional paint vehicle 50 having a right-handed Cartesian coordinate system 52 is further shown moving in a forward longitudinal direction 28 within lane 6 and along roadway mark path 16, restriping the roadway mark line segments of center skip line 12. It is understood that the term “vehicle” is given its broadest meaning, including any conveyance, motorized device, or moving piece of mechanical equipment for transporting passengers or apparatus. More specific and preferred examples of vehicles are trucks and road marking machines.
[0165] As indicated in
[0166] Maintenance restriping of the mark segments applies new roadway marking material substantially over each roadway mark segment and applies new roadway mark material (including reflective elements if specified) especially over worn-away portion 30 and breaks 34 and 36, thereby rehabilitating and maintaining the contrast visibility of the mark segments for a given skip line, or over an entire single or double line, or any combination thereof.
[0167] Restriping of the mark segments onto a newly paved roadway surface applies new roadway marking material substantially over each previously recorded roadway mark segment and applies new roadway mark material (including reflective elements if specified) maintaining the contrast visibility of the mark segments for a given skip line, or over an entire single or double line, or any combination thereof, basically replicating the original roadway marking.
[0168] Roadway mark segments are usually characterized by rectangular shaped marks defined by near and far longitudinal edge lines and beginning and ending lateral edge lines. For example, mark segment 18 is substantially rectangular having near longitudinal edge line 40 (i.e., the longitudinal edge line closest to vehicle 50) and far longitudinal edge line 44 (i.e., the longitudinal edge line farthest from vehicle 50), and beginning lateral edge line 42 (i.e., the first lateral edge line approached by vehicle 50 traveling in direction 28) and ending lateral edge line 46 (i.e., the second lateral edge line approached by vehicle 50). The edge lines form a substantially rectangular-shaped boundary of the roadway mark 18. Lateral edge lines 42 and 46 define the beginning and ending lines, respectively, of mark segment 18, and points 43 and 47 define the center points of lateral edge lines 42 and 46, respectively.
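The edge-line geometry above reduces to simple midpoint arithmetic; a minimal sketch (the coordinate names and planar frame are assumptions for illustration) of locating the center points of the beginning and ending lateral edge lines:

```python
def lateral_edge_centers(near_y: float, far_y: float,
                         begin_x: float, end_x: float):
    """For a rectangular mark segment bounded by near/far longitudinal
    edge lines (at lateral positions near_y, far_y) and beginning/ending
    lateral edge lines (at longitudinal positions begin_x, end_x), return
    the center points of the two lateral edge lines (points 43 and 47)."""
    mid_y = (near_y + far_y) / 2.0
    return (begin_x, mid_y), (end_x, mid_y)

# A 15 ft long, 0.5 ft wide mark segment beginning at x = 0
print(lateral_edge_centers(0.0, 0.5, 0.0, 15.0))  # ((0.0, 0.25), (15.0, 0.25))
```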
[0169] Further shown in
[0170] Referring additionally to
[0171] The lateral distance between paint guns 84 and 86 may be manually adjusted to accommodate the restriping of parallel double lines (for example, a solid line parallel to a skip line, usually used to designate an allowed passing zone, or two solid lines, usually used to designate a no passing zone, etc.). In a similar fashion, the lateral distance between reflective bead guns 88 and 90 can be manually adjusted to allow lateral alignment with paint guns 84 and 86, respectively.
[0172] Further attached to carriage 80 are laterally extendible, cylindrically shaped support arms 94 and 95. A hydraulic cylinder 411 having a piston 440 connected to a piston rod 447 (hydraulic cylinder 411, piston 440, and piston rod 447 are shown in
[0173] Referring now to
[0174] Thus, as carriage 80 moves in a lateral direction inward to and outward from vehicle 50, line pattern 106 also moves giving a visual indication (for a visible laser line generator) of the lateral position of carriage 80 (and which is imaged by camera 252). The lateral positions of the paint guns 84 and 86 (and their respective nozzles) and bead guns 88 and 90 are therefore also visually indicated by line pattern 106 taking into consideration any fixed offsets between the paint and bead guns and laser line pattern 106. Laser line generator 102 may also be moved laterally along the frame of carriage 80 and positioned so that line pattern 106 is laterally aligned with one of the paint guns, for example paint gun 84.
[0175] Laser line generator 102 may also be mounted to frame 54 projecting the fanned pattern of laser light 104 first horizontally with respect to surface 4 and then subsequently reflected downward by a mirror mounted on carriage 80 again forming line pattern 106 with surface 4. Carriage 80 may further have reflective ruler markings 115 placed onto front frame member 116 of carriage 80, which may be imaged by imaging system 60 and which then may also indicate the lateral position of carriage 80. Laser line generator 102 may also include a conventional laser pointer projecting a substantially circular “spot” pattern onto roadway surface 4 and within imaged area 70.
[0176] Also alternately attached to frame 54 is a conventional draw wire sensor 110 (shown hidden as a dashed outline in
[0177] A laterally moveable paint carriage 130 identical to carriage 80 is attached to the passenger side of vehicle 50 and is shown in a slightly extended position beyond the passenger's side of vehicle 50 in
[0178] Further attached to carriage 130 are laterally extendible cylindrically shaped support arms 144 and 146 (not shown). A hydraulic piston 148 (not shown) is positioned between support arms 144 and 146. The moveable end of hydraulic piston 148 is attached to the side frame member of carriage 130, and the other end of the piston is secured to frame 54 of paint vehicle 50. Hydraulically powering piston 148 provides the necessary force to laterally extend or retract carriage 130 from paint vehicle 50 thereby enabling the positioning of paint guns 134 and 136 along with their respective nozzles and respective bead guns 138 and 140 over a roadway mark.
[0179] Carriage 130 further has a laser line generator or laser pointer mounted to its frame for projecting a laser line onto roadway surface 4 within imaged area 75, reflective ruler markings on the front frame member, and a draw wire sensor or other transducers for determining the lateral position of carriage 130.
[0180] Imaged area 70 includes any pre-existing roadway 1 center skip line 12 (or any other center line which may include single or double solid, or a combination of a skip and a solid line, or any combination thereof) with vehicle 50 travelling anywhere within lane 6. Similarly, imaged area 75 includes any pre-existing roadway 1 edge line 10 with vehicle 50 travelling anywhere within lane 6. Both imaged areas 70 and 75 laterally extend past the full lateral extension of their respective carriages 80 and 130, and also image their respective roadway surface 4 laser line pattern 106 or spot images and/or carriage ruler markings 115.
[0181] As shown in
[0182] Imaging system 60 may also be mounted over carriage 80 on a fixable swingable mount (i.e., the mount can swing back along the side of vehicle 50 when not required) having a frontal field of view projected forward in the longitudinal direction and so positioned to image area 70 and line pattern 106. Imaging system 65 may be similarly mounted over carriage 130 to image area 75 and its respective laser line pattern. Other locations on vehicle 50 for mounting imaging systems 60 and 65 for imaging areas 70 and 75, respectively, are possible. For example, the imaging systems 60 and 65 may be mounted rearward of the cab of vehicle 50 with a rearward field of view, or on the carriages 80 and 130.
[0183] Referring now to
[0184] Outlet ports 186, 188, 190, and 192 of valves 178, 180, 182, and 184 connect to the near ends of flexible conduits 194, 196, 198, and 200, respectively. The far ends of flexible conduits 194 and 196 are connected to paint guns 84 and 86, respectively, and the far ends of flexible conduits 198 and 200 are connected to bead dispensing guns 88 and 90, respectively.
[0185] In response to pressurized air flow, the respective paint and bead guns open permitting the pressurized paint and/or beads to be forcibly dispensed onto roadway surface 4. The flexible conduits allow delivery of air to the paint and bead guns as the carriage laterally moves to align the guns (and their respective nozzles) with the pre-existing roadway mark. The material supply lines to the individual paint and bead guns are not shown to avoid clutter and add clarity in
[0186] Solenoid valves 178, 180, 182, and 184 each have separate positive and negative electrical connections for supplying electrical energy to activate their respective valve switching solenoids. Fused electrical power is supplied to the positive terminals of valves 178, 180, 182, and 184 via terminals 202, 204, 206, and 208, respectively. Fused electrical power to operate the valve solenoids may be derived from a 12-volt battery (not shown) of vehicle 50.
[0187] The negative terminal of valve 178 connects via a line 210 to one terminal of an electronically controlled switch 212. The other end of switch 212 connects to ground via a line 214.
[0188] The on-off state of switch 212 is controlled by an externally generated electrical control signal which flows to the control terminal C via a control line 216. In a similar fashion, the negative terminals of valves 180, 182, and 184 connect via lines 218, 220, and 222 to one terminal of electronically controlled switches 224, 226, and 228, respectively. The other ends of switches 224, 226, and 228 connect to ground via lines 230, 232, and 234, respectively. Similarly, the on-off states of switches 224, 226, and 228 are controlled by externally generated electrical signals which flow to their respective control terminals C via control lines 236, 238, and 240, respectively.
[0190] In addition, each valve 178, 180, 182, and 184 has conventional protective circuitry 242, 244, 246, and 248 (circuitry details are not shown), respectively. The protective circuitry minimizes any flyback voltages induced across the respective positive and negative solenoid voltage terminals during solenoid-initiated valve switching.
[0191] A similarly constructed roadway mark material pressurized air control system 250 (shown in
[0192] A manufacturer of commonly used solenoid valves for controlling the distribution of pressurized air to control the flow of paint and/or beads through their respective guns is MAC Valves, Inc. located in Wixom, Mich.
[0193] Referring now to
[0194] Mount 254 enables camera 252 to be independently rotated about the three axes 268, 270, and 272 in directions 274, 276, and 278, respectively, which enables camera 252 to be spatially positioned to image area 70. Mount 254 is further affixed to a vertical leg 280 of angle bracket 258 via conventional bolts 282, 284, 286, and 288.
[0195] Plexiglas globe 256 encloses both camera 252 and mount 254 and further has a distal hemispherical surface 290 and proximal mounting lip 292. Lip 292 has a rectangular-shaped groove 294 for accepting an O-ring 296 and additionally provides surface area 298 for mounting globe 256, via circumferentially arrayed conventional bolts 300, 302, 304, 306, 308, 310, 312, and 314, onto vertical leg 280 of bracket 258. The horizontal leg 281 of right-angle bracket 258 is affixed to the top surface of a compressor enclosure 56 (see
[0196] Imaging system 65 is identical to imaging system 60, having a camera or imager 330, a 3-axis adjustable mount 332, a protective Plexiglas globe 334, and an angle bracket 336 (all not shown, including the parts of those components). Imager 330 is identical to imager or camera 252 having a lens element 338, an optical filter 340, and an optical axis 342. Power, data, and control signals communicate with imager 330 via an electrical cable 344 (all not shown).
[0197] Referring now to
[0198] A splined shaft 415 (not shown) of steering control unit 410 is axially aligned with and is attached to a shaft 412 of electric motor 414 via a connecting hub 417. Steering wheel 416 is axially aligned with and is also attached to shaft 412 of electric motor 414 via hub 417 with conventional circumferentially mounted bolts 419a, 419b, and 419c. Internal to electric motor 414 is a programmable motor controller 413 which externally communicates via a communication bus or cable 421 with computer 702 (shown in
[0199] Reservoir 404 connects to the inlet port 438 of pump 402 via conduit 418. Outlet port 439 of pump 402 connects to the pressure (P) port 446 of steering control unit 410 and the input port of relief valve 408 via conduit 424. The output port of relief valve 408 connects to reservoir 404 via conduit 422. The tank (T) port 441 of steering control unit 410 connects to the inlet port of filter 406 via conduit 430. The output port of filter 406 connects to reservoir 404 via conduit 420. The right port (R) 442 of steering unit 410 connects to the port 444 of cylinder 411 via conduit 426, and the left port (L) 443 of steering unit 410 connects to the port 445 of cylinder 411 via conduit 428.
[0200] Cylinder 411 has piston 440 with connected piston rod 447 which extends and retracts in directions 452 and 453, respectively, in response to hydraulic fluid flow in conduits 426 and 428. The proximal end of piston rod 447 connects to piston 440 and the distal end of rod 447 attaches to the inside of side frame member 118 of carriage 80 at attachment point 96 using a clevis fastener 448. Thus, hydraulically extending rod 447 laterally extends carriage 80 and hydraulically retracting rod 447 laterally retracts carriage 80.
[0201] A clockwise rotation of splined shaft 415 of steering control unit 410, either produced automatically by motor 414 or manually with steering wheel 416, causes a pressure differential between the surface areas of piston 440. This pressure differential forces piston 440, and therefore piston rod 447, to move into hydraulic cylinder 411 in direction 453, thereby laterally retracting carriage 80 into the driver's side of vehicle 50.
[0202] A counterclockwise rotation of the splined shaft 415 of steering control unit 410, either produced automatically by motor 414 or manually with steering wheel 416, causes a pressure differential between the surface areas of piston 440. This pressure differential forces piston 440, and therefore piston rod 447, to extend outwardly from hydraulic cylinder 411 in direction 452, thereby laterally extending carriage 80 outwardly from the driver's side of vehicle 50.
[0203] It is therefore understood that computer 702 may communicate with motor 414 via commands sent to controller 413 via cable 421, and therefore may control the lateral position of carriage 80. Electrically disengaging motor 414 (defined as allowing the free rotation of shaft 412) by computer 702 allows spline shaft 415 of steering control unit 410 to be manually rotated via steering wheel 416 without any interference or assistance from motor 414. With a disengaged motor 414, the lateral position of carriage 80 may be manually controlled as if motor 414 had not been inserted into hydraulic steering system 400.
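The closed loop described above, in which computer 702 nulls the lateral error by commanding motor 414 through controller 413, can be sketched as a simple proportional controller; the gain, the command units, and the clamping limit are assumptions for illustration, not the patent's control law:

```python
def lateral_error(best_fit_path: float, marker: float) -> float:
    """Error signal: lateral offset of the marker from the best-fit
    roadway mark path, in the same units as the inputs (e.g., meters)."""
    return best_fit_path - marker

def motor_command(error: float, gain: float = 4.0, limit: float = 1.0) -> float:
    """Proportional steering command for the motor controller, clamped
    to a plausible actuator limit. Gain and limit are illustrative."""
    return max(-limit, min(limit, gain * error))

# A marker 0.05 m short of the path yields a small corrective command;
# large errors saturate at the actuator limit.
print(motor_command(lateral_error(1.85, 1.80)))
print(motor_command(2.0))  # saturates at 1.0
```

In practice the gain and limit would be tuned to the hydraulic steering system's response, and the error would be updated from each processed image.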
[0204] Hydraulic steering system 450 (not shown) controls the lateral movement of carriage 130. Hydraulic steering system 450 is identical in every respect to hydraulic system 400 except that the hydraulic connections to the hydraulic cylinder are reversed, so that a counterclockwise rotation of the steering wheel (or motor) retracts, and a clockwise rotation extends, carriage 130. Other hydraulic systems and other configurations are possible for controlling the movement of carriages 80 and 130.
[0205] Referring to
[0206] Referring to
[0207] Drive shaft 508 is further connected to a conventional rear axle differential which in turn drives the rear axle of vehicle 50. Further attached to the rear axle are driver and passenger side rear wheels 57 (see
[0208] As drive shaft 508 rotates in the direction 510, collars 502 and 504 along with spacer 506 also rotate in the same direction 510. Cylindrically shaped permanent magnets 512 and 514 are embedded and potted within, and are radially arrayed around the outer circumference of, collars 502 and 504, respectively. Further, collar 504 is rotatably displaced from collar 502 so that magnets 514 are radially aligned between magnets 512. A manufacturer of these types of magnetic shaft collars is Electro-Sensors, Inc. of Minnetonka, Minn. 55343.
[0209] Conventional Hall-effect sensors 516 and 518 are positioned in close proximity to the outer circumference of shaft collars 502 and 504, respectively, and are attached to the body frame 54 of vehicle 50 by conventional mounts (not shown). Sensors 516 and 518 detect the changing magnetic flux produced by magnets 512 and 514, respectively, as collars 502 and 504 rotate in response to rotation in the direction 510 of drive shaft 508.
[0210] In response to the changing magnet flux, sensors 516 and 518 produce active low signals 520 and 522 (represented by pulses 524 and 526, respectively, illustrated along a time or “t” axis in
[0211] Signal 534 is composed of the signals from shaft collars 502 and 504. Having collar 504 rotatably displaced from collar 502 allows twice as many magnetic pulses as are possible from just one collar, given a particular shaft collar size and number of magnets per collar. Having additional collar 504 increases the angular resolution of drive shaft 508 rotation per pulse. More collars rotatably displaced from one another may be added to further increase the angular resolution of drive shaft 508.
[0212] For example, if collar 502 has a total of 36 magnets then each active low pulse 524 corresponds to an angular rotation resolution of 10 degrees. With second collar 504 also having 36 magnets and rotatably displaced so that magnets 514 are aligned between magnets 512 of collar 502, a second non-interfering active low pulse 526 is produced between pulses 524, in effect giving an angular rotation resolution of 5 degrees. Therefore, each pulse of signal 534 corresponds to a known angular rotation of drive shaft 508 and therefore a known angular rotation of rear wheel 57.
[0213] The longitudinal distance travelled (or the longitudinal distance that will be travelled) by vehicle 50 is then easily determined by counting the number of pulses of signal 534 and multiplying this number by the distance travelled per pulse of signal 534. In the past, this distance-travelled-per-pulse value was prone to a multitude of errors, as previously mentioned in the background section of this document; these errors are greatly diminished according to the preferred embodiment of this invention.
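The resolution and distance arithmetic of the two paragraphs above can be sketched as follows (the function names and the sample values are illustrative):

```python
def degrees_per_pulse(magnets_per_collar: int, collars: int = 2) -> float:
    """Angular rotation of drive shaft 508 per pulse, with each added
    collar's magnets interleaved between the previous collar's magnets."""
    return 360.0 / (magnets_per_collar * collars)

def distance_travelled(pulse_count: int, distance_per_pulse: float) -> float:
    """Longitudinal distance: pulse count times the calibrated
    distance travelled per pulse."""
    return pulse_count * distance_per_pulse

print(degrees_per_pulse(36, collars=1))  # 10.0 degrees with one 36-magnet collar
print(degrees_per_pulse(36, collars=2))  # 5.0 degrees with the second collar
```

The distance per pulse would be calibrated from the wheel circumference and any gearing between drive shaft 508 and rear wheels 57.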
[0214] Other means exist to generate pulses which represent the longitudinal distance travelled by the vehicle and include generating pulses from the flywheel.
[0215] Interface circuit 538 may incorporate a conventional microprocessor 540 in bi-directional communication with bus interface circuitry 542. Interface circuitry 542 handles all bi-directional communication between local bus 544 and microprocessor 540. Microprocessor 540 may input signal 534 from line 536 and be programmed by computer 702 to perform computational tasks such as counting a certain number of pulses of signal 534 over a particular time interval via a conventional gating signal. For example, computer 702 may communicate to interface circuit 538 a "start count" command, which would instruct circuit 538 to begin counting the pulses of signal 534, and then communicate a "stop count" command, which would instruct circuit 538 to stop counting. Computer 702 may then request from circuit 538 the total pulse count of signal 534 which occurred between the "start count" and "stop count" commands, whereby circuit 538 would send the total pulse count back to computer 702 via local bus 544; alternatively, the pulse count of signal 534 may be synchronously or asynchronously sent to computer 702.
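A minimal software model of the gated counting behavior just described (the class and method names are illustrative; the actual interface circuit 538 is hardware and firmware):

```python
class GatedPulseCounter:
    """Counts pulses of signal 534 only between "start count" and
    "stop count" commands, mimicking interface circuit 538."""

    def __init__(self) -> None:
        self.count = 0
        self.counting = False

    def start_count(self) -> None:
        self.count = 0
        self.counting = True

    def stop_count(self) -> None:
        self.counting = False

    def on_pulse(self) -> None:
        # Invoked once per active-low pulse on line 536.
        if self.counting:
            self.count += 1

counter = GatedPulseCounter()
counter.on_pulse()          # ignored: gate not yet opened
counter.start_count()
for _ in range(8):
    counter.on_pulse()      # counted
counter.stop_count()
counter.on_pulse()          # ignored: gate closed
print(counter.count)        # 8
```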
[0216] Bus interface circuitry 542 conditions microprocessor 540 signals intended to be sent onto bus 544 to be compatible with the chosen bus 544 specification, and conditions signals received from bus 544 intended for microprocessor 540 to be compatible with the signal specifications of microprocessor 540. Bus 544 may include, for example, conventional CANopen or EIA-485 (formerly referred to as RS-485) communication protocol specifications. Thus, interface circuitry 542 is in bi-directional communication with computer 702 (and other components shown in
[0217] Referring now to
[0218] An identical machine vision and machine learning computer control system 750 controls carriage 130 but is not shown, except that system 750 will not include a GPS receiver (only one RTK enabled GPS receiver 154 and GPS antenna 152 are shown for vehicle 50) or a drive shaft positional encoder or sensor 500 (only one drive shaft positional encoder or sensor 500 is required for vehicle 50). Computer 702 is in bidirectional communication with a similar computer 752 (not shown) of identical machine learning based control system 750 via a bi-directional bus 710. Alternately, the tasks performed by computer 752 may be managed entirely by computer 702.
[0219] Computer 703 is in bi-directional communication (i.e., sends and receives data) with and among various components, including GPS receiver 154 and IMU 155, imager 252, pressurized air control system 160, valve 427, drive shaft positional sensor 500, linear positional sensor 110 and precision time protocol (PTP) network timing unit 111 via master bus 712 and local busses 724, 266, 714, 421, 544, 716 and 113 respectively, and each component is in bi-directional communication with the others.
[0220] Master bus 712 may be composed of a number of different individual local busses, each individual local bus having different electrical and mechanical specifications supporting their respective communication specifications. Bus 712 may also provide power. For example, local bus 266 may be a camera link bus, USB compatible or ethernet bus, local bus 421 may be a CANopen bus and local bus 113 may be an ethernet bus and, when grouped or bundled together form part of master bus 712.
[0221] Keyboard 706 and liquid crystal (or similar) display 704 are conventional computer peripherals and are connected to computer 703 via bidirectional universal serial buses (USB) 718 and 720, respectively. Keyboard 706 allows an operator to enter alpha-numeric and other data (for example, commands such as “LAYOUT” or “STRIPE”) into computer 703, and display 704 displays information from computer 703 for viewing by the operator. Display 704 may also be a conventional “touch” display allowing the operator to both view information and enter data by selectively touching areas displayed on the display 704, similar to the displays used on “smart” cell phones such as the Apple iPhone 11. A conventional computer-compatible mouse and joystick (not shown) are also provided for entering data into computer 703 by the operator.
[0222] Power supply 708 supplies electrical power to all components shown in
[0223] GPS antenna 152 is electrically in communication with GPS receiver 154 via electrical cable 158. Antenna 152 and receiver 154 are adapted to receive conventional GPS signals 156 from any GPS satellite system (for example, Russia's GLONASS system or the United States' Global Positioning System), or from a GPS-pseudolite system. In addition, receiver 154 is further adapted to use RTK data via a separate communication channel (not shown) to complement the satellite-derived GPS data, thereby increasing the GPS positional accuracy of vehicle 50.
[0224] The single (master) antenna/receiver GPS system shown in
[0225] An inertial measurement unit (IMU) 155 may also be combined with the GPS receiver 154 to complete an inertial navigation system (INS). Typical INS systems include first and second GPS antennas and receivers and can be used to determine heading information along with roll, pitch and yaw information of paint truck 50. An example of an INS system using first and second GPS antennas (and receivers) is model number n580 manufactured by Honeywell Aerospace.
[0226] GPS receiver 154 decodes signals received by antenna 152 and uses RTK data (via the separate communication channel) to determine the geographical location (longitude, latitude, and altitude, or the ECEF position) of antenna 152. The location of antenna 152 is known with respect to coordinate system 52.
[0227] Bi-directional communication with GPS receiver 154 among the other components of system 700 is via local bus 724 and master bus 712. As previously stated, only one RTK enabled GPS system is required on vehicle 50 for accurately measuring travelled distance of vehicle 50.
[0228] Imager 252 is a conventional progressive scan CMOS imager for capturing an image 253, having a CMOS sensor with a pixel array, usually arranged in a rectangular format, for converting light into electrical signals, such as GigE compatible model number acA1440-73gc manufactured by Basler AG of Ahrensburg, Germany. For example, the sensor of the acA1440-73gc has an array of 1440×1080 active pixels and conforms to the GigE specification. It is noted that more than one imager may be incorporated into striping system 700.
[0229] Attached to imager 252 is conventional lens 260 which may have optical band pass filter 262 (shown in
[0230] Included within imager 252 is electronic circuitry (not shown) which communicates status, control, and image data using a conventional ethernet bus interface via local bus 266 and master bus 712 to computer 703. Further, imager 252 may be triggered to acquire an image from a trigger signal derived directly from GPS receiver 154 and IMU 155 combination through computer 703 (via an image acquisition system 726) or from other time-deterministic trigger sources (i.e., the time of occurrence of the trigger signal is known), or free run, i.e., acquire images asynchronously, in which case the images are time-stamped using PTP timing unit 111.
[0231] Imager 252 is a GPS calibrated imager which provides an accurate GPS location for each pixel of the roadway image (including a roadway mark) captured by imager 252. The GPS position of each pixel of the image may be offset corrected and may be referenced back to the vehicle Cartesian coordinate system 52.
[0232] Linear position (transducer) sensor 110 measures the relative lateral linear displacement of carriage 80 with respect to frame 54 of vehicle 50. For example, linear position sensor 110 may be a conventional industrial digital CANopen draw wire sensor model number WDS-5000-P115 manufactured by Micro-Epsilon of Raleigh, N.C. (United States office) having the sensor housing mounted on frame 54 of vehicle 50 and a flexible steel (Teflon-coated) wire affixed to side frame member 118 at attachment point 96 of carriage 80. Linear sensor 110 may also be a conventional laser range finder affixed to frame 54 and focused on a reflective target mounted on the inside of side frame member 118 of carriage 80, or may be a conventional linear variable differential transformer (LVDT). Piston 411 may also incorporate an internal linear displacement sensor, in which case the cylinder is commonly referred to as a “smart cylinder”. A manufacturer of smart cylinders is Aggressive Hydraulics, Inc. of Cedar, Minn.
[0233] Position and other data are communicated between sensor 110 and computer 703 via local bus 716, which then becomes a member of master bus 712. Computer 703 may poll (request) sensor 110 for positional information or sensor 110 may continuously send positional data to computer 703. The position of carriage 80 is known via sensor 110 with respect to coordinate system 52 (offset adjusted).
[0234] Thus, it is understood that the relative lateral positional movement of carriage 80 with respect to frame 54 is determined by sensor 110, and relative distances moved by carriage 80 can be calculated from differences in position locations, as well as position locations (and distances) of objects mounted on carriage 80, including the positions of paint and bead guns and their respective nozzles, also relative to coordinate system 52 (offset corrected).
[0235] Pressurized air control system 160 is in communication with computer 703 via local bus 714 and master bus 712. Thus, computer 703 can control the dispensing of roadway mark material via pressurized air control system 160.
[0236] Computer 703 also has internally available peripheral component interconnect (PCI) expansion slots and/or peripheral component interconnect express (PCIe) expansion slots. For example, computer 703 may be provided with a conventional PCIe input-output board inserted into a PCIe compatible expansion slot for sending and receiving digital control signals from computer 703 to external peripherals, such as conventional roadway mark material pressurized air control system 160, and for receiving digital signals from external peripherals to computer 703. Additionally, computer 703 includes a graphical processing unit 713 such as that offered by NVIDIA model number TITAN V which is PCIe slot compatible. Other types of expansion slots may be provided to accommodate additional peripheral cards.
[0237] Computer 703 further includes an image acquisition system 726 for hardware interfacing imager 252 with computer 703. Acquisition system 726 may include a conventional frame grabber PCIe expansion slot compatible image frame grabber card such as model number NI PCIe-1433, a high-performance camera link frame grabber card manufactured by National Instruments Corporation of Austin, Tex. For GigE based imagers, a frame grabber card is not required.
[0238] System 726 also includes a random-access memory (RAM) buffer for storing acquired images 253 from imager 252, and handles all of the software overhead (control, image data transfers, etc.) for interfacing imager 252 to computer 703. Acquisition system 726 further has an external image trigger input 728. In response to an external trigger signal placed onto input 728, acquisition system 726 sends a control signal to imager 252 via busses 712 and 266 to acquire or otherwise “snap” an image 253 at a known time. Image 253 data (pixel intensity (grayscale or color) and pixel array location values) are then transferred from imager 252 to the on-board buffer memory of acquisition system 726 via the respective busses and then subsequently transferred to data memory 820 (shown in
[0239] Computer 703 also includes conventional timing module 730 which may be programmed either by computer 703 or from an external programming source via signals placed on line 732 to perform certain timing tasks, and may be used as a trigger source for acquiring images from imager 252 at known times or to provide a time-stamp for asynchronously acquired images.
[0240] An external trigger source (not shown) generates and accurately controls the timing of the external trigger signal and may be programmed by computer 703 to produce various trigger signals. For example, the trigger source may be programmed to generate a periodic trigger signal having a known frequency. In response to the periodic trigger signal, imager 252 acquires a sequence of images 253 having accurate and known time intervals between each acquired image. A sequence of images 253 may then be acquired in response to a deterministic external trigger signal.
[0241] Additionally, the trigger signal may be position dependent, i.e., the trigger signal is produced after the vehicle has travelled a certain distance (determined by the number of pulses).
[0242] The trigger source may be a conventional programmable signal generator, or may be derived from the computer internal timer, a timing module 730, an external microcontroller-based system or GPS receiver 154.
[0243] Images 253 may also be obtained asynchronously from imager 252 and subsequently time-stamped using an external precision time protocol (PTP) network timing unit 111. An example of a PTP timing unit 111 is model number TM2000A manufactured by TimeMachines, Inc. of Lincoln, Nebr. Computer 703 is configured to communicate with timing unit 111 via bus 712 and local bus 113.
[0244] Therefore, it is understood that a sequence of images may be acquired by imager 252 and placed into data memory 820 (see
[0245] The acquired image is stored in memory 820 as an array of grayscale values for black and white images or as a three-dimensional array for color images having a one-to-one correspondence with the pixel array. For example, a black and white CMOS sensor having a 640×480 pixel array will output a 640×480 array of grayscale values for each image. Alternately, a color imager having a 640×480-pixel array would have, for example, three 640×480 arrays to accommodate red, green and blue (commonly referred to as RGB) image color values. Other color models may be used.
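The grayscale and RGB storage layouts described above can be illustrated with plain nested arrays; the small dimensions below are stand-ins for the 640×480 sensor.

```python
# Stand-in dimensions for a 640x480 sensor (kept small for illustration).
WIDTH, HEIGHT = 4, 3

# Grayscale image: a single HEIGHT x WIDTH array of intensity values,
# in one-to-one correspondence with the pixel array.
gray = [[0 for _ in range(WIDTH)] for _ in range(HEIGHT)]

# Color image: one HEIGHT x WIDTH array per channel (red, green, blue).
color = {ch: [[0 for _ in range(WIDTH)] for _ in range(HEIGHT)]
         for ch in ("red", "green", "blue")}

print(len(gray), len(gray[0]))   # 3 4
print(sorted(color))             # ['blue', 'green', 'red']
```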
[0246] Additionally, each pixel of an acquired image from imager 252 has a known GPS location relative to the Cartesian coordinate system 52 (offset corrected), which is stored in data memory 806 (imager 252 is GPS calibrated).
[0247] Referring to
[0248] Operating system software 802 may include a real time operating system (RTOS), UNIX, LINUX, Windows (offered by Microsoft Corporation), or other compatible operating system software, and performs conventional operating system software functions and is capable of executing various programs stored in program memory 804 of computer 703.
[0249] Program memory 804 includes an image correction program 808, a pixel-to-distance program 810, an image analysis program 812, a mark path projection program 814, a machine vision carriage control program 816, a dynamic positional calibration program 818, and a dispensing control program 820.
[0250] Image correction program 808 inputs raw image data acquired from camera 252 and subsequently corrects the raw image data for optical pin-cushion or barrel distortion produced by lens 260 (and possibly Plexiglas globe 256) and then secondly corrects for perspective distortion using a conventional homography algorithm. Both the raw image and corrected image data are stored in data memory 806, along with the GPS location for each pixel of each acquired image. The GPS position for each pixel may also be computed from the known GPS position of image space origin 905 and the pixel-to-distance program 810.
[0251] For example, the object space (i.e., the actual physical field of view of camera 252) of area 70 includes substantially rectangular-shaped roadway mark segment 18, having near longitudinal edge line 40 (i.e., the longitudinal edge line closest to vehicle 50) and far longitudinal edge line 44 (i.e., the longitudinal edge line farthest from vehicle 50), and beginning lateral edge line 42 (i.e., the first lateral edge line approached by vehicle 50) and ending lateral edge line 46 (see
[0252] Referring now to
[0253] Referring to
[0254] The corrected undistorted image 914 data are then stored into data memory 806. Each image (both raw and corrected) has a tagged GPS location (and also each pixel has an associated GPS location) and is time stamped and stored in data memory 806 along with the respective image.
[0255] Also shown in both
[0256] The respective distortion parameters required by image correction program 808 to correct for optical distortion are determined by a conventional optical distortion correction program, such as that offered by The MathWorks, Inc. of Natick, Mass., which is known in the camera calibration art. Perspective distortion is then corrected using a homography transformation of the optically undistorted image (it is assumed that the roadway surface 4 is planar within the field of view of camera 252). Image data of dimensionally defined 2-D checkerboard patterns, along with the appropriate software, are used by image correction program 808 to determine the corrections necessary to minimize the optical and perspective distortions. Moreover, the image u-v coordinates may extend beyond the actual undistorted image boundaries, again assuming the roadway surface 4 is planar within the field of view of camera 252, and in particular include the area under the paint and bead guns and their respective nozzles.
[0257] Pixel-to-distance transformation program 810 transforms each pixel of the undistorted image into an equivalent object space distance (for example, one pixel in image space in the u-axis direction of the corrected image may correspond to 0.25 inches or 6 mm in object space in the x-direction), or transforms each object space coordinate into a corresponding undistorted image space coordinate. Further, the x-y-z object space location of each pixel is determined and referenced to coordinate system 52 by conventional calibration methods. In particular, pixel-to-distance transformation program 810 may determine the object space x-y-z coordinates of the image space origin 905 of the undistorted image from which all other pixel coordinates in object space may be determined. Data necessary for performing this transformation are again experimentally determined from known object space x-axis and y-axis dimensions of an imaged checkerboard pattern, and determining the z-axis coordinate of the roadway surface 4. Thus, every pixel (in image space) has an equivalent object space x-y-z axis coordinate referenced to coordinate system 52 and vice versa, and also a GPS (RTK enhanced) location. These pixel-to-object and object-to-pixel distance transformation data are then stored in data memory 806.
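A minimal sketch of the pixel-to-distance idea, under the simplifying assumption of a purely affine mapping: a known object-space location for image space origin 905 plus a fixed per-pixel scale from checkerboard calibration. The numeric values are illustrative only; real calibration data would replace them.

```python
# Illustrative calibration values (assumptions, not from the description):
# object-space location of image space origin 905 in coordinate system 52,
# and a per-pixel scale determined from an imaged checkerboard pattern.
ORIGIN_XYZ = (12.0, -3.0, 0.0)      # metres
SCALE_M_PER_PIXEL = 0.00635         # roughly 0.25 inch per pixel


def pixel_to_object(u, v):
    """Map an undistorted image coordinate (u, v) to object space (x, y, z).

    Assumes u maps to x and v maps to y, and that roadway surface 4 is
    planar (constant z), as the homography correction in the text assumes.
    """
    x = ORIGIN_XYZ[0] + u * SCALE_M_PER_PIXEL
    y = ORIGIN_XYZ[1] + v * SCALE_M_PER_PIXEL
    return (x, y, ORIGIN_XYZ[2])


def object_to_pixel(x, y):
    """Inverse mapping from object space back to undistorted image space."""
    u = (x - ORIGIN_XYZ[0]) / SCALE_M_PER_PIXEL
    v = (y - ORIGIN_XYZ[1]) / SCALE_M_PER_PIXEL
    return (u, v)


x, y, z = pixel_to_object(100, 200)
u, v = object_to_pixel(x, y)
print(round(u), round(v))   # round-trips to 100 200
```

The forward and inverse transforms together give the pixel-to-object and object-to-pixel data that program 810 stores in data memory 806.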
[0258] Referring now to
[0259] Within image analysis program 812 is a supervised machine learning network (algorithm) 815 which inputs the corrected image 914. Machine learning networks may include fully or partially connected conventional neural networks, convolutional-based neural networks (CNNs) or support vector machines (SVMs). All the above-mentioned machine learning networks are well known in the art.
[0260] The machine learning network 815 is configured to classify each pixel of image 914 and outputs a classification array 817 for the image 914. The size of the classification array 817 equals the size of the pixel array of image 914, and each element of the classification array 817 is one-to-one mapped to a pixel of image 914. For supervised convolutional neural networks, the classification array 817 may give a probability (i.e., confidence) having a range [0,1] that a certain pixel belongs to the classification. One or more classification schemes may be used. For example, one classification scheme may define the confidence level for a pixel imaging a roadway mark, another may define the confidence level for a pixel imaging a roadway surface, yet another may define the confidence level for a pixel imaging a yellow roadway mark, and a further scheme may define a logical ‘1’ or ‘0’ according to whether or not a pixel belongs to a roadway mark; combinations of individual classification schemes may also be used. The classification schemes are “learned” during training of the machine learning network.
[0261] For training machine learning networks for this application, roadway mark data are imaged, and the network automatically adjusts its internal parameters (commonly referred to as weights and biases) to minimize a cost function. An algorithm called backpropagation is typically used to compute the parameter adjustments that minimize the cost function.
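The weight-adjustment idea, minimizing a cost function by following its gradient, can be illustrated in miniature with a single weight and a squared-error cost; a real network applies the same update rule, via backpropagation, across all weights and biases. The data and learning rate below are illustrative.

```python
# Fit a single weight w so that w * x approximates the targets, minimizing a
# squared-error cost by gradient descent (the one-parameter analogue of the
# weight updates performed during network training).
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # (input, target): target = 2*x

w = 0.0
learning_rate = 0.05
for _ in range(200):
    # dCost/dw for cost = sum over data of (w*x - target)^2
    grad = sum(2 * (w * x - t) * x for x, t in data)
    w -= learning_rate * grad

print(round(w, 3))   # converges to 2.0
```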
[0262] One source of roadway mark data for training the networks is that provided by CARLA, a simulator for producing realistic roadway marks (see “CARLA: An Open Urban Driving Simulator” by Dosovitskiy et al., 1st Conference on Robot Learning, Mountain View, Calif.). Additional training data may be obtained from recorded roadway mark image data.
[0263]
[0264] Image analysis program 812 further has a filtering program 829 which operates on the classification array 817 confidence values and sets each confidence value to a 1 or 0 depending upon a programmable threshold value 831. For example, the threshold value 831 may be set to 0.9, meaning that any confidence value greater than or equal to 0.9 is set to “1” and any confidence value less than 0.9 is set to “0”. The machine learning network 815 may also be trained to directly output a 0 or 1 classification.
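The thresholding performed by filtering program 829 reduces to comparing each confidence value against threshold value 831; a sketch with a small stand-in confidence array:

```python
THRESHOLD = 0.9   # programmable threshold value 831

# Stand-in for classification array 817: one confidence value per pixel.
confidence = [
    [0.05, 0.92, 0.97],
    [0.10, 0.95, 0.88],
]

# Set values >= threshold to 1 and all others to 0, as program 829 does.
binary = [[1 if c >= THRESHOLD else 0 for c in row] for row in confidence]

print(binary)   # [[0, 1, 1], [0, 1, 0]]
```

The resulting array of 1s and 0s corresponds to output 833 described below.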
[0265] The output 833 of the filtering program 829 is therefore an array of 1s and 0s mapped to the corresponding pixels, with the 1s representing the specified classification (meeting the threshold criteria), for example, that the imaged pixel represents part of a roadway mark.
[0266]
[0267]
[0268] The output 833 of the filtering program 829 then flows to the conventional image segmentation program 835. Segmentation program 835 includes enhancement and restoration methods and other image processing methods to restore misclassified data 834 and outputs array 836 sharply defining the boundary 837 of the imaged roadway mark representation as illustrated in
[0269] Each element of array 836 maps to a pixel and therefore has known image and object coordinates, as well as a GPS location. Array 836 (along with the indexed data) is referred to as “array space”.
[0270] Image processing methods, including image segmentation, enhancement and restoration, are described in many texts, including Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing (2d ed., Prentice Hall, 2002), and are part of program 812.
[0271] Using array 836 data, program 812 further performs numerous geometric and array type calculations. For example, the center point 838 of vertical boundary line 839 can be determined as well as the center point 842 of vertical boundary line 844 by counting the number of 1's in the vertical boundary line columns and dividing this number by 2, thereby determining the center location of the roadway mark path 16. The center points 838 of subsequent vertical boundary lines may be used for computing a best-fit continuous roadway mark path. Other points may be used.
[0272] For example, the bottom and top location of the vertical boundary line column is (18,9) and (8,9) respectively. The length is computed by taking the difference between these array elements and equals 10 array 836 units. This difference can also be used to determine the width of the roadway mark in either image or object space using program 810. Both the image and object space distances and coordinates of the roadway mark may be determined from array 836 using program 810. Additionally, the GPS location of any array element of array 836 is known.
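The center-point and length computation on array 836 can be sketched directly from the example coordinates above, a vertical boundary-line column at column 9 spanning rows 8 through 18 (the array size is otherwise illustrative):

```python
# A vertical boundary-line column of array 836: 1s in rows 8..18 of column 9,
# mirroring the example array elements (18, 9) and (8, 9) in the text.
ROWS, COL = 24, 9
column = [1 if 8 <= r <= 18 else 0 for r in range(ROWS)]

ones = [r for r, v in enumerate(column) if v == 1]
top, bottom = min(ones), max(ones)

# Length = difference between the top and bottom array elements.
length = bottom - top                  # 18 - 8 = 10 array units

# Center point: count the number of 1s and divide by 2, as in the text.
center_row = top + sum(column) // 2

print(length, center_row)   # 10 13
```

Multiplying these array-unit results by the pixel-to-distance scale of program 810 would yield the corresponding object-space width and location.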
[0273] Image analysis program 812 also determines both the beginning and ending lateral edge lines 42 and 46, and the center points 43 and 47 of the beginning and ending lateral edge lines 42 and 46, respectively, from a sequence of images and their corresponding array 836 data of the roadway mark segment 18. Additionally, the image analysis program 812 also determines the image and object space coordinates of lines 42 and 46 and center points 43 and 47 using data from pixel-to-distance program 810. The GPS locations of the center points 43 and 47 are also known. Other geometric attributes of the roadway mark may be determined by program 812.
[0274] In addition, program 812 determines the image space coordinates of the imaged laser line pattern 106a (and therefore its respective elements in array 836) from the corrected image and determines its corresponding image and object space coordinates. The results of image analysis program 812 are stored in data memory 806.
[0275] Image analysis program 812 also inputs pulse count data from drive shaft positional sensor 500 and can perform calculations using these and other data.
[0276] Image analysis program 812 also may determine if the imaged roadway mark segment 18 comprises a single or double line, a solid or skip line, or any combination and the line patterns using image and object space calculations and conventional image and array processing algorithms, in addition to the mark color. The type of line being imaged, its location (GPS, object space and image space locations) and time of acquisition is stored in data memory 806.
[0277] Image analysis program 812 also determines the speed of vehicle 50. It does so by determining the array 836 coordinate difference between features of successive images of the roadway mark segment 18 (for example, the image of the beginning lateral edge line 42), converting this difference to an object space distance using data from pixel-to-distance program 810, and taking the difference in time between the successive images. The time each image was acquired and the time interval between images are determined by the timing of a trigger signal placed onto trigger input 728, or by other deterministic trigger sources previously mentioned (for example, timing signals derived from GPS receiver 154, PTP unit 111 or other timing means), and are known by image analysis program 812. The speed of vehicle 50 is then determined from both the object distance travelled and the amount of time taken to travel this distance (distance/time). Data from GPS receiver 154 may also be used to determine distances and time intervals, and therefore the speed of vehicle 50.
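The speed calculation thus reduces to the object-space distance a feature moves between two triggered images divided by the trigger interval. A sketch, with an illustrative pixel-to-distance scale and trigger period (both assumptions for the example):

```python
SCALE_M_PER_PIXEL = 0.006     # illustrative pixel-to-distance value
TRIGGER_PERIOD_S = 0.1        # known interval between triggered images


def vehicle_speed(row_image1, row_image2):
    """Speed of vehicle 50 from the array-row shift of one feature
    (e.g. beginning lateral edge line 42) between successive images."""
    pixels_moved = abs(row_image2 - row_image1)
    distance_m = pixels_moved * SCALE_M_PER_PIXEL   # image -> object space
    return distance_m / TRIGGER_PERIOD_S            # distance / time


# A feature shifting 150 rows between images gives 0.9 m in 0.1 s (9 m/s).
print(round(vehicle_speed(400, 250), 3))
```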
[0278] Image analysis program 812 can also analyze array 836 data produced from a sequence of images (and also the corrected images) and determine the gap and mark segment lengths and determine the skip line pattern (for example, a 15/40 pattern) and type of roadway marking.
[0279] In addition, program 812 can also determine the presence or absence of a roadway mark at a particular GPS location and cause the dispensing control program 820 to dispense roadway mark material. This feature is important for layout-based restriping.
[0280] Image analysis program 812 further includes an image stitching program which creates a continuous image from the sequence of corrected images and corresponding array 836 data. Creating one continuous image from a sequence of images is well known in the art. For example, MATLAB's documentation provides an example of automatically creating a panorama using feature-based image registration techniques. Other stitching programs and techniques may additionally be used.
[0281] Image analysis program 812 additionally inputs the discrete GPS coordinates of the images and computes a continuous, smooth (spline-based) mathematical function for the roadway mark path which fits the discrete GPS coordinates of the images. This continuous function will be used for the layout-based striping process.
[0282] For example, program 812 may use the center points 838 for determining a best-fit roadway mark path.
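A minimal stand-in for this continuous path function, using piecewise-linear interpolation through discrete center points where the text describes a spline; the coordinates below are illustrative, not recorded data.

```python
from bisect import bisect_right

# Illustrative discrete center points 838: (longitudinal position s,
# lateral offset), standing in for the GPS coordinates of the images.
points = [(0.0, 1.00), (10.0, 1.05), (20.0, 1.20), (30.0, 1.20)]


def path(s):
    """Continuous roadway mark path: lateral offset at longitudinal s,
    by linear interpolation (a smooth spline would be used in practice)."""
    xs = [p[0] for p in points]
    i = min(max(bisect_right(xs, s) - 1, 0), len(points) - 2)
    (x0, y0), (x1, y1) = points[i], points[i + 1]
    t = (s - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)


print(round(path(15.0), 3))   # midpoint of the 10 -> 20 segment: 1.125
```

Any longitudinal position along the recorded route can then be mapped to a lateral target, which is what the layout-based striping process requires.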
[0283] All data produced from program 812 is stored in memory 806.
[0284] As previously noted, program 812 includes a machine learning algorithm for classifying individual pixels (referred to as semantic classification) of the roadway surface image.
[0285]
[0286] Mark path projection program 814 computes an equation (mathematical model) which predicts the roadway mark path 16 in array 836 space (and/or image and/or object space) based upon a sequence of array coordinates of individual center points 838 and 842 sequentially determined from a sequence of corrected images and image analysis program 812. Other available array 836 element coordinates may be defined and used in calculations. In addition, mark path projection program 814 may also determine an equation which predicts the mark path 16 projection in array space (and/or object space using pixel-to-distance program 810), and may also compute a smooth best-fit roadway mark path.
[0287] For example, two coordinate pairs in array space may be used to develop a straight line mathematical model (a conventional y=mx+b linear equation) of the roadway mark path 16 in either array or object space, and three coordinate pairs may be used to develop a quadratic or other type of interpolated curvature model. This information is then used along with the image space-to-object space conversion values from pixel-to-distance program 810 to develop an object and array space prediction model of the roadway mark path 16 (the mark path followed by, for example, the center point 43 of edge line 42), and to project the mark path rearward of vehicle 50 especially over carriage 80 and the paint and bead gun area. Using array 836 data in calculations (array space calculations) may prove to be more convenient.
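The two-point linear model and three-point quadratic model mentioned above can be computed in closed form; the coordinate pairs below are illustrative center-point coordinates, not recorded data.

```python
def line_through(p0, p1):
    """y = m*x + b through two center points (the conventional linear model)."""
    (x0, y0), (x1, y1) = p0, p1
    m = (y1 - y0) / (x1 - x0)
    return m, y0 - m * x0


def quadratic_through(p0, p1, p2):
    """y = a*x^2 + b*x + c through three points (Lagrange interpolation)."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    a = (y0 / ((x0 - x1) * (x0 - x2))
         + y1 / ((x1 - x0) * (x1 - x2))
         + y2 / ((x2 - x0) * (x2 - x1)))
    b = (-y0 * (x1 + x2) / ((x0 - x1) * (x0 - x2))
         - y1 * (x0 + x2) / ((x1 - x0) * (x1 - x2))
         - y2 * (x0 + x1) / ((x2 - x0) * (x2 - x1)))
    c = (y0 * x1 * x2 / ((x0 - x1) * (x0 - x2))
         + y1 * x0 * x2 / ((x1 - x0) * (x1 - x2))
         + y2 * x0 * x1 / ((x2 - x0) * (x2 - x1)))
    return a, b, c


# Project the mark path rearward of the imaged area by extrapolating the
# linear model past the last observed center point (coordinates illustrative).
m, b = line_through((0.0, 1.0), (10.0, 2.0))
print(m * 25.0 + b)   # predicted lateral position at x = 25: 3.5
```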
[0288] Machine vision and carriage control program 816 positions carriage 80 so that paint gun 84 along with its respective nozzle and its associated bead gun 88 are placed over a pre-existing single roadway mark segment 18. It is assumed that paint gun 86 and its bead gun 90 have been laterally adjusted to accommodate a second roadway mark if gun 84 is properly aligned with its respective roadway mark segment 18. Any number of paint and/or bead guns may be accommodated. Machine vision and carriage control program 816 may use either the array or object space roadway mark path 16 mathematical projection model from mark path projection program 814.
[0289] Machine vision and carriage control program 816 also computes the intersection point of the lateral object space projection line 81 equation and the roadway mark path 16 object space path projection equation derived from array 836 space. The projection line 81 equation is computed from calibration points (and may also be transformed into array space coordinates for an “extended” array space underneath the carriage).
[0290] The coordinates of the intersection point define the alignment location of paint gun 84 and its respective nozzle (and bead gun 88) to dispense roadway mark material directly over roadway mark segment 18.
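If the roadway mark path model is y = m*x + b and lateral projection line 81 is taken, purely for illustration, as a line of constant longitudinal position x, the intersection-point computation reduces to one substitution; the calibrated line-81 equation would be used in practice, and all numbers below are assumptions.

```python
def alignment_point(m, b, x81):
    """Intersection of the mark-path model y = m*x + b with a lateral
    projection line 81 modeled as x = x81: the target coordinates for
    aligning paint gun 84 and its nozzle over roadway mark segment 18."""
    return (x81, m * x81 + b)


# Illustrative model: path drifting laterally at slope 0.02, with
# projection line 81 located at longitudinal position 4.0 m.
target = alignment_point(0.02, 1.5, 4.0)
print(round(target[0], 3), round(target[1], 3))   # 4.0 1.58
```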
[0291] The current position of carriage 80 (and, therefore, the current position of paint gun 84 and its respective nozzle, offset adjusted) may be computed by image analysis program 812 using coordinates of the corrected imaged laser line pattern 106a of laser line pattern 106. Machine vision and carriage control program 816 then uses the intersection point of the lateral projection line 81 equation and the roadway mark path 16 path projection equation in array (or object space) to compute the required coordinate position of the imaged laser line pattern 106a of the projected laser line pattern 106 to laterally position paint gun 84 and its respective nozzle on top of roadway mark path segment 18. The current carriage 80 position is also known by linear position sensor 110.
[0292] Machine vision and carriage control program 816 also corrects for offsets among the paint and bead guns and laser line pattern 106 and other system offsets. Knowing the current lateral position of the carriage 80 from linear position sensor 110 and the equivalent lateral object space coordinates of the roadway mark (derived from array 836 data) may also be used to laterally align the paint gun 84. Additionally, machine vision and carriage control program 816 can extend or retract carriage 80 to laterally align the paint gun 84 with the roadway mark using the array 836 data (which contains the mapped laser line 106a and the roadway mark representations), or may use carriage 80 position data from linear position sensor 110.
[0293] Referring now to
[0294]
[0295]
[0296] It is noted that the images produced by imager 252 may be dynamically utilized (i.e., in real time) for maintenance-based striping or may have been previously recorded and stored in data memory 806 for layout-based striping.
[0297] Control system 1700 comprises a mark path projection system 1701 (which may comprise mark path projection program 814), a machine vision and machine learning based carriage control system 1720 (which may comprise machine vision and carriage control program 816), hydraulic system 401, the linear position sensor 110, imager 252, an image correction system 1725 (which may comprise image correction program 808), and an image analysis system 1730 (which may comprise image analysis program 812). System 1720 further comprises a mark alignment calculator 1703, a comparator 1705, and a carriage position controller 1710. Systems 1701, 1720, 1725, and 1730 may be implemented in software, hardware (such as an FPGA), or a combination of software and hardware.
[0298] Mark path projection system 1701 inputs data from image analysis system 1730 via a line 1740 and creates a roadway mark path 16 mathematical projection model in array (and/or object) space as previously described.
[0299] This model is then used by mark alignment calculator 1703 to calculate the intersection point between the lateral projection line 81 array (and/or object) space equation and the roadway mark segment 18 path projection equation to predict the image space lateral position of the actual roadway mark segment 18 as it passes under carriage 80 at the position of the paint gun lateral projection line 81 in array (and/or object) space (it is assumed that the array and/or object space equation of lateral projection line 81 has been previously determined by calibration). This intersection point is the desired lateral position in array (and/or object) space of paint gun 84 and its respective nozzle to dispense the roadway mark material (paint) directly over and onto the pre-existing roadway mark segment 18.
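The intersection calculation performed by mark alignment calculator 1703 can be sketched as follows. This is an illustrative Python fragment, not part of the disclosed apparatus; the line parameters (a lateral projection line 81 lying along array row v = 400 and a slightly skewed mark path) are assumed values chosen only for the example.

```python
import numpy as np

def intersect(p1, d1, p2, d2):
    """Intersection of two array-space lines given in point-plus-direction
    form, p1 + t*d1 and p2 + s*d2. Returns the (u, v) intersection point,
    or None when the lines are parallel."""
    A = np.array([[d1[0], -d2[0]], [d1[1], -d2[1]]], dtype=float)
    if abs(np.linalg.det(A)) < 1e-9:
        return None  # parallel lines: no unique intersection
    t, _ = np.linalg.solve(A, np.subtract(p2, p1).astype(float))
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Assumed example: lateral projection line 81 along the row v = 400,
# mark path drifting 0.05 array element laterally per row of travel.
lateral_line_81 = ((0.0, 400.0), (1.0, 0.0))
mark_path = ((320.0, 0.0), (0.05, 1.0))
gun_target = intersect(*lateral_line_81, *mark_path)  # desired (u, v)
```

The returned point is the desired lateral position of paint gun 84 in the sense of the paragraph above; a production implementation would of course use the calibrated equations rather than these assumed numbers.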
[0300] The desired lateral position coordinate data in array (and/or object) space are then input into the positive (+) input of comparator 1705.
[0301] Comparator 1705 takes the difference between the desired lateral position of the paint gun 84 and its respective nozzle to dispense the roadway mark material directly onto the pre-existing roadway mark segment 18 and the actual lateral position of paint gun 84 and its respective nozzle determined by the linear position sensor 110 and calibration data (offset corrected) in either array (and/or object space), and generates an error signal 1707. Error signal 1707 is then input into carriage position controller 1710.
[0302] Also, from the sequence of corrected images, the coordinates of imaged laser line pattern 106a of laser line pattern 106 are determined by image analysis system 1730, and hence the actual array (and/or object) space coordinates of paint gun 84 and its respective nozzle are known (offset corrected) with respect to image 106a. These data are then input into the negative (−) input of comparator 1705.
[0303] Controller 1710 sends hydraulic system 401 positional commands to valve 427 via CANopen interface 429. The positional commands provide smooth coordinated movements with a carriage velocity profile consistent with roadway marking systems.
[0304] In response to the positional commands received from carriage position controller 1710, hydraulic system 401 via valve 427 either extends or retracts carriage 80 thus changing the lateral position of the paint gun 84 (and also bead gun 88). If the error signal 1707 equals zero, the hydraulic system 401 maintains the current carriage 80 lateral position (hence the current paint gun 84 and its respective nozzle lateral position).
[0305] Changing the lateral position of carriage 80 also laterally moves imaged laser line pattern 106a, and machine vision and learning based carriage control system 1720 moves carriage 80 in a lateral direction which minimizes error signal 1707 thereby aligning paint gun 84 with roadway mark segment 18 as in a conventional classical (servo) feedback system. Similarly, the lateral position of the carriage may also be determined by the linear position sensor 110 and is known in both array and object space.
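The servo behavior described above can be sketched as a simple proportional loop. This is an illustrative Python fragment under an assumed gain and slew limit; the actual carriage position controller 1710 issues hydraulic valve commands and its tuning is not disclosed here.

```python
def carriage_step(desired_u, actual_u, gain=0.8, max_step=5.0):
    """One iteration of a proportional position loop (illustrative only).
    A positive return value extends the carriage, a negative value
    retracts it, and a zero error holds the current position."""
    error = desired_u - actual_u        # analogous to error signal 1707
    step = gain * error                 # proportional correction
    # Clamp to a slew limit for smooth, coordinated movement.
    return max(-max_step, min(max_step, step))
```

Repeatedly applying the clamped correction drives the error toward zero, which is the sense in which system 1720 behaves as a conventional classical (servo) feedback system.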
[0306] It is therefore understood that the location of carriage 80 is automatically adjusted to correctly position paint gun 84 and its respective nozzle over a roadway mark segment 18 using machine vision and machine learning technology.
[0307] Referring now to
[0308] The roadway mark path continuous function produced by program 812 is used by system 1720 to position paint gun 84 (and the respective bead gun 88) directly over the roadway mark path continuous function.
[0309] Also produced by program 812 is a dispensing command which is used by the dispensing control program 820 for dispensing roadway mark material. Program 812 determines at what location along the roadway mark path continuous function to dispense roadway mark material to either replicate the original pre-recorded roadway mark or to stripe layout marks.
[0310] It is therefore understood that the location of carriage 80 is automatically adjusted to correctly position paint gun 84 and its respective nozzle over the continuous roadway mark path determined by program 812 using machine vision and machine learning technology.
[0311] It is now necessary to determine when to turn on and turn off paint gun 84 to correctly and accurately duplicate a pre-existing roadway mark segment 18 for maintenance-based striping. It is understood that gun 86 is similarly controlled by system 700.
[0312] Dynamic positional calibration program 818 dynamically calibrates the pulse-to-distance ratio of drive shaft positional sensor 500 by computing the pixel difference in array space between common features of roadway mark images, such as center point 838 in
[0313] For example, in
[0314] This technique does not rely upon tire diameter or pressure and is therefore more accurate and more dynamic than conventional methods: it auto-calibrates with every image that contains an identifiable feature. A new current distance-per-pulse ratio is therefore calculated continuously with each image and is not a fixed value as currently assumed in the industry. Alternately, the number of array 836 elements per pulse may also be determined, here as 6 array elements/50 pulses=0.12 array elements per pulse. Alternately, the distance travelled by vehicle 50 can be determined conventionally using the drive shaft encoder 500 and a constant (i.e., static) distance-per-pulse ratio.
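The dynamic calibration can be expressed numerically as follows. In this illustrative Python sketch, the element pitch is an assumed calibration constant, and the pixel coordinates merely reproduce the 6-elements-over-50-pulses example from the text.

```python
def distance_per_pulse(u_first, u_second, pulses, metres_per_element):
    """Dynamic pulse-to-distance calibration: the array-space shift of a
    common feature (such as center point 838) between two images, divided
    by the drive-shaft pulses counted between those images. The element
    pitch (metres_per_element) is an assumed calibration value."""
    elements = abs(u_second - u_first)
    return elements * metres_per_element / pulses

# Reproducing the worked numbers from the text: a feature shifts
# 6 array elements over 50 pulses -> 0.12 array elements per pulse.
elements_per_pulse = abs(106 - 100) / 50
```

Because the ratio is recomputed for every image pair that shares a feature, it tracks changes in tire diameter or pressure automatically rather than assuming a fixed factory value.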
[0315] Dispensing control program 820 controls pressurized control system 160 (via an I/O board in computer 703 as previously disclosed) and determines which solenoid valves to activate and the time duration. Dispensing control program 820 also considers the turn-on and turn-off delays of the individual valves during the dispensing of the respective roadway material (paint and bead).
[0316] Referring now to
[0317] From the image of area 70, the object space location of mark segment 18 beginning line 42 is known relative to the origin 1002 of the object space x-y coordinate system (programs 810 and 812 perform this image space-to-object space transformation), and in particular the longitudinal distance 1004 from line 42 to origin 1002 is determined. The longitudinal distance 1006 from gun 84 to origin 1002 has been previously determined via a calibration procedure, and the distance 1008 has been determined by using a ruler or other calibration methods. Therefore, distances 1010 and 1012 are simply determined by adding distances 1004 and 1006, and by adding distances 1004, 1006, and 1008, respectively. The time to turn on paint gun 84 is when line 42 is under paint gun 84 or, equivalently, when line 42 has travelled a total distance 1010. The travelled distance 1010 is determined using the distance-per-pulse ratio previously determined in dynamic positional calibration program 818 or conventionally (not dynamically) by counting pulses 534 from sensor 500. Similarly, distance 1012 is calculated by counting pulses 534 equivalent to distance 1012. Travelled distance can also be calculated using the GPS receiver 154 and IMU 155 combination. Dispensing control program 820 also takes into consideration the turn-on and turn-off times of the respective guns. Equivalent calculations may also be performed in array 836 space.
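The distance bookkeeping above reduces to simple addition followed by a distance-to-pulse conversion. The following sketch uses hypothetical distances named after the reference numerals; the actual values depend on the vehicle geometry and calibration.

```python
# Hypothetical distances in feet (names follow the reference numerals):
dist_1004 = 4.0    # beginning line 42 to object-space origin 1002 (from image)
dist_1006 = 12.0   # paint gun 84 to origin 1002 (prior calibration)
dist_1008 = 1.5    # paint gun 84 to bead gun 88 (measured offset)

dist_1010 = dist_1004 + dist_1006               # turn-on travel for gun 84
dist_1012 = dist_1004 + dist_1006 + dist_1008   # turn-on travel for gun 88

def pulses_for(distance_ft, ft_per_pulse):
    """Convert a travel distance into the drive-shaft pulse count to wait
    before dispensing (ft_per_pulse comes from program 818 or a static
    calibration)."""
    return round(distance_ft / ft_per_pulse)
```

With these assumed values, the gun turn-on points become fixed pulse counts that dispensing control program 820 can compare against the accumulated count from sensor 500.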
[0318] Dispensing continues until the number of pulses 534 equals the equivalent distance of roadway mark segment length 24 which has been previously input into computer 702 via keyboard 706 by the operator or determined from images using image analysis program 812. Also note that dispensing does not occur for the next mark segment 32 until an accumulated pulse count equal to the distance of roadway mark gap segment length 26 has been obtained either determined by operator input or from images using image analysis program 812. Because the distance-to-pulse ratio is continuously updated and dynamically calculated, accurate maintenance striping of the roadway mark elements occurs without the need for additional carriage operators to force a lead or lag time adjustment during the dispensing cycle. Alternately, just a static previously calibrated distance-per-pulse ratio may be used. Distances may also be calculated using the speed of vehicle 50 and time.
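The resulting dispensing cycle is a repeating on/off pattern keyed to the accumulated pulse count. A minimal sketch, assuming a 15/40 skip pattern (15 ft mark on a 40 ft cycle) and an illustrative 0.05 ft-per-pulse ratio:

```python
def gun_on(pulse_count, segment_pulses, gap_pulses):
    """True while roadway mark material should dispense for a repeating
    skip pattern: a mark of length 24 followed by a gap of length 26,
    both expressed as equivalent drive-shaft pulse counts."""
    cycle = segment_pulses + gap_pulses
    return (pulse_count % cycle) < segment_pulses

# Assumed example: 15 ft mark and 25 ft gap at 0.05 ft per pulse.
SEG, GAP = 300, 500
```

In practice the segment and gap lengths come from operator input via keyboard 706 or from image analysis program 812, and the ratio is updated dynamically, so the thresholds shown here would be recomputed continuously.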
[0319] Although the above discussion refers to a single yellow or white skip line roadway mark segment 18, similar procedures can be implemented to apply roadway mark material to a double skip line mark, or to single or double solid line mark(s) or combinations thereof.
[0320] In operation and referring additionally to
[0321] In step 1100, the operator positions vehicle 50 along a desired roadway mark path 16 having a roadway mark which needs to be rehabilitated (restriped), with location assistance supplied by GPS receiver 154, IMU 155, and LCD display 704. It is assumed that all calibration and offset data have been previously obtained and are stored in data memory 806. The driver then inputs the desired line stripe pattern (single or double, solid or skip line(s), for example 15/40) and the type of roadway mark material (paint or paint and bead) to be dispensed using keyboard 706 and depresses the start button on keyboard 706. Alternately, system 700 may determine the line stripe pattern by determining the length 24 of mark segment 18 and length 26 of gap segment 20 using image analysis system 1730 and dispense the roadway mark material onto the desired roadway mark, or the system may dispense roadway mark material on the currently visible roadway mark (what you see is what is striped).
[0322] The driver then proceeds to drive vehicle 50 along roadway mark path 16 at a vehicle speed consistent with the type of roadway mark material being used for the restriping process. Continuously updated GPS positional data may be displayed on LCD display 704 (as is currently available in automobiles) to assist the driver in positioning vehicle 50 at the correct starting location and along the roadway mark path 16.
[0323] Program flow then continues to step 1102.
[0324] In step 1102 and in response to the “START” button being depressed on keyboard 706, system 700 (and system 750 if vehicle 50 is so equipped) acquires a first time stamped raw image of the beginning of roadway mark segment 18 (see
[0325] In step 1104, the first time stamped raw image data are undistorted by image correction system 1725 and the first time stamped undistorted image of roadway mark segment 18 is stored in data memory 806 along with the time stamp and corresponding GPS positional data. Program flow then continues to step 1106.
[0326] In step 1106, the array 836 coordinates of a first center point 838 (see
[0327] In step 1108, system 700 (and system 750 if vehicle 50 is so equipped) acquires a second time stamped raw image of roadway mark segment 18 and GPS positional data. This second image occurs after, and is displaced from, the first image because vehicle 50 is moving along the roadway mark path 16 in direction 28. Program flow then continues to step 1110.
[0328] In step 1110, the second time stamped raw image data are undistorted by image correction system 1725 and the undistorted image of roadway mark segment 18 is stored in data memory 806 along with the time stamp and GPS positional data. Program flow then continues to step 1112.
[0329] In step 1112, the array 836 coordinates of a second displaced center point 838 of the second image are determined by image analysis system 1730. Program flow then continues to step 1114.
[0330] In step 1114, a roadway mark path 16 projection equation in array space is calculated from the first and displaced second image center points 838 (in this case the equation will be a line) using mark path projection system 1701. (Moreover, the array coordinates of the imaged laser line pattern 106a of laser line pattern 106 are determined by image analysis system 1730 for the acquired images if used.) Program flow continues to step 1116.
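The two-point projection equation of step 1114 can be sketched as follows. This illustrative Python fragment uses assumed center point coordinates; the parameterization u = a·v + b is chosen because the mark path runs along the travel (v) axis, so the two displaced images have distinct v values.

```python
def mark_path_projection(p1, p2):
    """Array-space mark path projection line through two displaced
    imaged center points 838, returned as the coefficients (a, b) of
    u = a*v + b. Assumes the two images are displaced along the travel
    axis, i.e. v1 != v2."""
    (u1, v1), (u2, v2) = p1, p2
    a = (u2 - u1) / (v2 - v1)   # lateral drift per row of travel
    b = u1 - a * v1             # lateral position at v = 0
    return a, b
```

Intersecting this line with the previously calibrated lateral projection line 81 equation (step 1116) then yields the target carriage position.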
[0331] In step 1116 and based upon the mark path projection equation derived from mark path projection system 1701 and the previously stored lateral projection line 81 equation, an intersection point is determined between these two equations in array (and/or object) space using mark alignment calculator 1703 of machine vision and machine learning based carriage control system 1720. Carriage 80 is positioned (offset corrected) to align paint gun 84 and its respective bead gun 88 directly over roadway mark segment 18, using linear position sensor 110 to determine the carriage lateral position (offset corrected). The imaged laser line pattern 106a may also be used to determine the lateral position of paint gun 84 and its respective bead gun 88. Program flow continues to step 1118.
[0332] In step 1118, the array 836 coordinates of the first center point 838 are subtracted from the u-v coordinates of the second center point 838 and the number of system 500 generated pulses occurring for this difference is determined by image analysis system 1730. Program flow continues to step 1120.
[0333] In step 1120, the array (and/or object) space distance from the second center point 838 to the intersection point between the mark path projection equation and the previously stored lateral projection line 81 is calculated by image analysis system 1730. Program flow continues to step 1122.
[0334] In step 1122, the number of system generated pulses required to cover the distance from the last imaged center point 838 to the intersection point along the projected image line of roadway mark path 16 is determined by image analysis system 1730. Program flow continues to step 1124.
[0335] In step 1124, when the number of drive shaft positional sensor 500 generated pulses has occurred as determined in step 1122, dispensing control program 820 controls the pressurized air flow via system 160 to gun 84 (and bead gun 88 if required, as previously input by the operator in step 1100). In response to the pressurized air, gun 84 (and bead gun 88 if required) begins dispensing roadway mark material onto, and in alignment with, roadway mark segment 18 until the number of system 500 pulses equals the desired mark segment length in image space as previously input by the operator or driver. Program flow then continues to step 1126.
[0336] In step 1126, after the number of system 500 pulses equals the required distance of mark segment 18 having length 24, paint gun 84 is turned off (and also its associated bead gun 88 if previously on) ceasing material dispensing for a number of drive shaft positional sensor 500 pulses equal to the length of the gap segment 20. Another dispensing cycle begins and continues until the desired entire length of center skip line 12 has been restriped.
[0337] Alternatively, the image analysis system 1730 may be instructed by the operator to operate in the “what you see is what you get” mode, in which case the roadway mark material is dispensed directly over the pre-existing roadway marks irrespective of any pre-defined pattern or length of roadway mark.
[0338] In operation, the process of maintenance striping of pre-existing solid line roadway marks using the preferred embodiment of this invention is similar to the above steps, except that in steps 1106 and 1112 the first and second points used to create the mark path projection line are derived from intermediary points in array (or object) space, for example by choosing a column of elements of array 836 and determining the lateral mid-point of the roadway mark for every image. Machine vision and machine learning based control system 1700 continually updates the lateral position of carriage 80, and therefore of paint gun 84 (and its respective nozzle) and bead gun 88, to continuously dispense roadway mark material directly over and onto a solid roadway mark segment. In this case, the dispensing in step 1126 continues for the length of the pre-existing solid pattern. The driver manually terminates the roadway mark dispensing process by depressing the “STOP” key on keyboard 706. Alternately, image processing program 812 can determine that a solid line(s) (or a passing pattern, which consists of one solid line and a skip line) is being detected, automatically continue dispensing for the length of the solid pattern, and then terminate the dispensing of roadway mark material at the ending of the mark.
[0339] It is therefore understood that existing single line, double line or skip-line patterns or combinations thereof may be restriped (rehabilitated) according to the teachings of this invention. The system automatically determines the beginning and ending of the imaged roadway marks and dispenses mark material (paint and bead) at the proper location onto the pre-existing roadway mark from a moving vehicle 50 using machine vision and machine learning technologies.
[0340] The layout-based restriping process will now be discussed using the striping system 700 shown in
[0341] In operation and referring additionally to
[0342] In step 2000, the driver positions vehicle 50 at the beginning of a pre-existing roadway marking before the roadway is repaved. At this point the operator can visually see the pre-existing roadway mark. Operational flow continues to step 2005.
[0343] In step 2005, the driver presses the “RECORD” button on keyboard 706 and proceeds to drive alongside the pre-existing roadway mark. Operational flow then continues to step 2010.
[0344] In step 2010 and in response to the depressing of the “RECORD” button, system 700 begins to continuously acquire images of the pre-existing roadway marks and tags each acquired image with both a GPS location derived from GPS RTK enabled receiver 154 and a time of acquisition derived from PTP timing unit 111 (or another timing source such as the real time clock of computer 703). The pixels of each image therefore have a GPS location. Each image is then undistorted using image correction program 808. Both the raw image and the undistorted image, along with the GPS locations of the detected mark (including the beginning and ending of the roadway mark) and the time stamp, are stored in data memory 806. Operational flow continues to step 2014.
[0345] In step 2014, supervised machine learning network (algorithm) 815 classifies the individual pixels of the corrected images and creates array space roadway mark data and saves this data to memory 806 (memory 806 now comprises a sequence of raw image data, matching corrected image data, and matching array data along with GPS locations and respective time stamps). Operational flow then continues to step 2016.
[0346] In step 2016, the sequence of individual raw, undistorted roadway surface images and the array space data of the marks are stitched together to form a first best-fit continuous roadway mark path 3050 (see
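One way to form such a best-fit continuous path is a least-squares fit over the GPS-tagged detections. The sketch below assumes planar easting/northing coordinates and a low-order polynomial, which is a simplification for illustration; the actual stitching and fitting method used by the system is not limited to this approach, and curved roads would likely need piecewise or spline fits.

```python
import numpy as np

def best_fit_path(gps_points, degree=2):
    """Stitch GPS-tagged roadway mark detections into one continuous
    best-fit path by least-squares polynomial fit of easting as a
    function of northing. gps_points is a sequence of (easting,
    northing) pairs in assumed planar coordinates."""
    pts = np.asarray(gps_points, dtype=float)
    pts = pts[pts[:, 1].argsort()]             # order along the travel axis
    coeffs = np.polyfit(pts[:, 1], pts[:, 0], degree)
    return np.poly1d(coeffs)                   # easting = f(northing)
```

The returned function plays the role of the continuous roadway mark path: given any station along the travel axis, it predicts the lateral position of the mark for later carriage positioning.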
[0347] In step 2018, the driver depresses the “STOP” button on keyboard 706 at the end of the roadway markings to terminate the recording process. This completes the recording portion of the layout-based restriping process. A complete continuous best-fit first roadway mark path 3050 having array space roadway mark data along with their respective GPS location and respective time stamp data and roadway mark type is now stored in memory 806.
[0348] The following discussion now assumes that the roadway is repaved (completely erasing the original roadway markings) and that either layout marks (small dashes or the like) or the original roadway markings will be striped on top of the new roadway surface substantially along the original roadway mark path as determined by the continuous best-fit roadway mark path 3050. The operational flow continues to step 2020.
[0349] In step 2020, the driver positions vehicle 50 at the beginning of the first roadway mark and depresses either the “LAYOUT” button or the “LAYOUT STRIPE” button on the keyboard 706 and proceeds to drive along the first roadway mark path as determined by the best-fit path (determined by the previously stored GPS location data) and indicated on the display 704. Operational flow continues to step 2025.
[0350] In step 2025 and if the driver depresses the LAYOUT button, the previously stored GPS location data of the best-fit roadway mark path are used to position carriage 80 over the roadway mark path 3050 continuous function, and a sequence of layout marks 3000, 3005, 3010, 3015, 3020, 3025 (usually one to two feet long) is applied onto the repaved roadway surface as shown in
[0351] The distance between the layout marks is usually 10 feet but may be adjusted according to a striping contractor specification. The distance travelled between the layout marks and the length of the layout marks may be programmed by program 812. Operational flow continues to step 2030.
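The layout mark placement reduces to stepping a station counter along the recorded path. A minimal sketch with the defaults stated above (both the mark length and the spacing are operator-adjustable per contractor specification):

```python
def layout_mark_stations(path_length_ft, spacing_ft=10.0, mark_len_ft=2.0):
    """Start/end stations (in feet along the recorded path) of the short
    layout marks; defaults follow the text, i.e. two-foot marks placed
    every ten feet."""
    stations = []
    start = 0.0
    while start + mark_len_ft <= path_length_ft:
        stations.append((start, start + mark_len_ft))
        start += spacing_ft
    return stations
```

Each returned interval would then be converted to pulse counts (or GPS stations) so that dispensing control program 820 knows where to turn the gun on and off along path 3050.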
[0352] Alternately in step 2025, if the driver depresses the LAYOUT STRIPE button, the actual original and recorded roadway mark is duplicated over the best-fit roadway mark path 3050 using the roadway mark type and GPS location data previously stored in memory 806. Operational flow continues to step 2030.
[0353] In step 2030, the vehicle 50 ends the dispensing of the layout marks or actual roadway markings at the end of the roadway mark path.
[0354] Although illustrated and described above with reference to certain specific embodiments and examples, the present invention is nevertheless not intended to be limited to the details shown. Rather, various modifications may be made in the details within the scope and range of equivalents of the claims and without departing from the spirit of the invention. It is expressly intended, for example, that all ranges broadly recited in this document include within their scope all narrower ranges which fall within the broader ranges.
[0355] It is also understood that the striping system 700 combines both the maintenance-based and layout-based striping processes and may be located on a single vehicle as shown in