APPARATUS AND METHOD FOR GRADE CONTROL
20220333339 · 2022-10-20
Inventors
CPC classification
E02F3/7618; E02F9/262; E02F3/434; E02F3/845 (all within FIXED CONSTRUCTIONS)
International classification
E02F3/84; E02F3/76 (FIXED CONSTRUCTIONS)
Abstract
A work machine including a frame, a ground-engaging element movably coupled to the frame, a movable grading element, an actuator coupled to the movable grading element to controllably drive movement of the movable grading element engaging material to be graded, a sensor, and a controller. The controller comprises a memory that stores computer-executable instructions and a processor that executes the instructions. The instructions include labeling each of at least a plurality of pixels in a first image as a visual marker; selecting the first image as a reference keyframe; tracking at least one region of a subsequent image including the visual marker relative to the reference keyframe to determine an estimate of a current pose as the work machine moves; and adjusting a position of the movable grading element.
Claims
1. A work machine comprising: a frame; a ground-engaging element movably coupled to the frame and driven by a power source to drive movement of the work machine; a movable grading element movably supported by the frame to move relative to the frame; an actuator coupled to the movable grading element to controllably drive movement of the movable grading element to engage material to be graded; a sensor coupled to the work machine that captures a plurality of images in a field of view; and a controller coupled to the sensor, the controller comprising a memory that stores computer-executable instructions and a processor that executes the instructions, to: label each of at least a plurality of pixels in a first image as a visual marker; select the first image as a reference keyframe; track at least one region of a subsequent image including the visual marker relative to the reference keyframe to determine an estimate of a current pose as the work machine moves; and adjust a position of the movable grading element with the actuator to achieve a desired grade based on the tracking of the visual marker.
2. The work machine of claim 1, wherein the tracking of the visual marker in a vertical direction is indicative of a cutting depth of the material.
3. The work machine of claim 1, wherein the processor labels a new first image at the beginning of each pass and selects the new first image as the reference keyframe.
4. The work machine of claim 3, wherein the new first image includes the visual marker from the first image.
5. The work machine of claim 1, wherein adjusting the position of the movable grading element includes one or more of a pitch, a yaw, and a roll of the movable grading element.
6. The work machine of claim 1, wherein the processor further logs tracking data and fuses the tracking data from the first image and the subsequent image to map three-dimensional movement of the visual marker.
7. The work machine of claim 6, wherein the mapping of the visual markers includes stitching together reference keyframes.
8. The work machine of claim 1, wherein the visual marker comprises a constellation of visual markers.
9. The work machine of claim 8, wherein the constellation of visual markers is used to stitch together keyframes, and thereby map the field of view.
10. The work machine of claim 1, further comprising: an inertial measurement unit coupled to the work machine wherein the inertial measurement unit provides a gravitation reference.
11. A method of controlling a work machine, the method comprising: capturing a plurality of images with a sensor coupled to the work machine in a field of view of the work machine; labeling each of at least a plurality of pixels as a visual marker in a first image from the plurality of images; selecting the first image as a reference keyframe; tracking at least one region of a subsequent image including the visual marker from the plurality of images relative to the reference keyframe to determine an estimate of a current pose as the work machine moves; determining a required movement of the movable grading element; and adjusting the position of the movable grading element with an actuator to achieve a desired grade based on the tracking of the visual marker.
12. The method of claim 11, wherein the tracking of the visual marker in a vertical direction is indicative of a cutting depth of a material.
13. The method of claim 11, wherein the processor labels a new first image at the beginning of each pass and selects the new first image as the reference keyframe.
14. The method of claim 13, wherein the new first image includes the visual marker from the first image.
15. The method of claim 11, wherein adjusting the position of the movable grading element includes one or more of a pitch, a yaw, and a roll of the movable grading element.
16. The method of claim 11, further comprising: logging tracking data; fusing the tracking data from the first image and the subsequent image; and mapping three-dimensional movement of the visual marker.
17. The method of claim 16, wherein the mapping of the visual markers is used to stitch together keyframes, and thereby map the grade of each pass.
18. The method of claim 11, further comprising: selecting a new first image as the reference keyframe at the beginning of each pass; determining the visual markers in the new first image; identifying a common visual marker from the new first image and the first image, wherein the common visual marker is labeled as an external reference; comparing the external reference from the new first image to the first image; and adjusting the position of the movable grading element.
19. The method of claim 11, wherein the visual marker comprises a constellation of visual markers.
20. The method of claim 18, wherein a constellation of visual markers is used to stitch together keyframes, and thereby map a field of view.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] Before any embodiments are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The disclosure is capable of supporting other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.
DETAILED DESCRIPTION
[0018] As used herein, unless otherwise limited or modified, lists with elements that are separated by conjunctive terms (e.g., “and”) and that are also preceded by the phrase “one or more of” or “at least one of” indicate configurations or arrangements that potentially include individual elements of the list, or any combination thereof. For example, “at least one of A, B, and C” or “one or more of A, B, and C” indicates the possibilities of only A, only B, only C, or any combination of two or more of A, B, and C (e.g., A and B; B and C; A and C; or A, B, and C).
[0019] As used herein, the term “controller” is a computing device including a processor and a memory. The “controller” may be a single device or alternatively multiple devices. The controller may further refer to any hardware, software, firmware, electronic control component, processing logic, processing device, individually or in any combination, including without limitation: application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
[0020] The term “processor” is described and shown as a single processor. However, two or more processors can be used according to particular needs, desires, or particular implementations of the controller and the described functionality. The processor may be a component of the controller, a portion of the object detector, or alternatively a part of another device. Generally, the processor can execute instructions and can manipulate data to perform the operations of the controller, including operations using algorithms, methods, functions, processes, flows, and procedures as described in the present disclosure.
[0024] Now referring to
[0025] Upon starting a new cutting pass 300, the processor 195 repeats the procedure: it acquires 512 a new first image 516, which includes one or more visual markers 335 from the first image 513, labels 505 a plurality of pixels 510 in the new first image 516 at the beginning of each pass 350 as a visual marker 335, and selects 515 the new first image 516 as the reference keyframe 520. As shown in
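The per-pass keyframe procedure above can be sketched in code. This is a minimal illustration with hypothetical names; the patent does not specify how pixels are labeled as visual markers, so a simple intensity threshold stands in for that step.

```python
# Hypothetical sketch of per-pass reference-keyframe selection.
# The intensity-threshold labeling is an assumption, not the patent's method.

def label_visual_markers(image, threshold=200):
    """Label each sufficiently bright pixel as part of a visual marker."""
    return [(r, c) for r, row in enumerate(image)
                   for c, val in enumerate(row) if val >= threshold]

class GradeTracker:
    def __init__(self):
        self.reference_keyframe = None
        self.reference_markers = []

    def start_pass(self, first_image):
        """At the beginning of each pass, label markers in a new first
        image and select that image as the reference keyframe."""
        self.reference_markers = label_visual_markers(first_image)
        self.reference_keyframe = first_image
        return self.reference_markers

image = [[0, 0, 255],
         [0, 255, 0],
         [0, 0, 0]]
tracker = GradeTracker()
markers = tracker.start_pass(image)
# markers -> [(0, 2), (1, 1)]
```

Subsequent images in the same pass would then be tracked against `tracker.reference_keyframe` rather than against each other, which limits drift within a pass.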
[0026] The processor 195 may further log 525 the data 550 acquired from tracking the visual marker 335 (hereinafter referred to as tracking data 550) and may fuse 555 the tracking data 550 using fusing logic 557 from the first image 513 and the subsequent image 415 to map either a two-dimensional or a three-dimensional movement of the visual marker 335 over a distance, thereby generating a model 570 of the worksite 150.
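The logging-and-fusing step can be illustrated as follows. This is a hedged sketch under assumed names: per-image displacements relative to the reference keyframe are fused into a trajectory of the marker over distance, which is the raw material for the worksite model.

```python
# Sketch of fusing logged tracking data into a marker trajectory.
# The (dx, dy, dz) displacement representation is an assumption; the patent
# only states that tracking data from multiple images is fused to map
# two- or three-dimensional movement of the visual marker.

def map_marker_movement(start_pos, displacements):
    """start_pos: (x, y, z) of the marker in the reference keyframe.
    displacements: per-image offsets relative to the reference keyframe."""
    return [tuple(s + d for s, d in zip(start_pos, disp))
            for disp in displacements]

track = map_marker_movement((0.0, 0.0, 0.0),
                            [(0.5, 0.0, -0.01), (1.0, 0.0, -0.03)])
# -> [(0.5, 0.0, -0.01), (1.0, 0.0, -0.03)]
```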
[0027] Tracking of the visual marker 335 in a vertical direction 560 is indicative of a cutting depth 138 of the material, which may correlate to a change in grade 158, as shown in
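A simple numeric illustration of the depth relationship: the marker's vertical pixel displacement between the reference keyframe and the current image, scaled by an assumed pixel-to-metre factor, gives a cutting-depth estimate. The scale factor and marker correspondence are assumptions; the patent states only that vertical tracking is indicative of cutting depth.

```python
# Hypothetical sketch: vertical marker displacement as a proxy for
# cutting depth. metres_per_pixel is an assumed calibration constant.

def vertical_displacement(ref_marker, cur_marker):
    """Image rows grow downward, so a positive value means the marker
    moved down in the frame relative to the reference keyframe."""
    return cur_marker[0] - ref_marker[0]

def cutting_depth(ref_marker, cur_marker, metres_per_pixel=0.01):
    return vertical_displacement(ref_marker, cur_marker) * metres_per_pixel

depth = cutting_depth((10, 40), (16, 40))
# 6 pixels down at 0.01 m/px -> 0.06 m
```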
[0028] Mapping of the visual markers 335 includes stitching together the reference keyframes 520.
[0029] The constellation 355 of visual markers 335 is used to align together reference keyframes 520 and images 405 (as shown in
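The constellation-based alignment can be sketched as estimating the offset between two keyframes from the markers they share. The translation-only model and averaging scheme are assumptions chosen for brevity; the patent does not prescribe a specific alignment method.

```python
# Sketch of aligning reference keyframes via a constellation of shared
# markers: the inter-keyframe offset is estimated as the mean displacement
# of markers visible in both frames (a least-squares translation estimate).

def estimate_offset(constellation_a, constellation_b):
    """constellation_a/b: dict of marker_id -> (row, col) per keyframe."""
    common = constellation_a.keys() & constellation_b.keys()
    if not common:
        raise ValueError("no common visual markers to align on")
    dr = sum(constellation_b[m][0] - constellation_a[m][0]
             for m in common) / len(common)
    dc = sum(constellation_b[m][1] - constellation_a[m][1]
             for m in common) / len(common)
    return dr, dc

kf1 = {"m1": (10, 10), "m2": (20, 30)}
kf2 = {"m1": (12, 15), "m2": (22, 35), "m3": (5, 5)}
offset = estimate_offset(kf1, kf2)
# both shared markers moved by (2, 5) -> offset (2.0, 5.0)
```

Averaging over several shared markers, rather than relying on one, makes the stitch robust to a single mislabeled or occluded marker.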
[0030] An inertial measurement unit 162 may be coupled to the work machine and provide a gravitation reference. Fusing 555 the tracking data 550 with output from an inertial measurement unit 162 may strengthen the ability to track motion with respect to the external reference frame (consisting of external reference points 340). Keyframes 520 may be logged in a multitude of orientations, and the current position and heading of the work machine 100 may be used to stitch the various reference keyframes 520 and images 405 to correlate the visual markers 335, and therefore map 565 grading and grading productivity on a worksite 150. This advantageously provides full three-dimensional modeling (shown in
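One common way to fuse a visual pose estimate with an IMU gravitation reference is a complementary filter; the patent names fusing logic but not a specific filter, so the blend weight and function names below are assumptions.

```python
# Hedged sketch of fusing a visually tracked pitch estimate with the pitch
# implied by the IMU's gravity vector. The complementary-filter weight
# alpha is an assumed tuning parameter.
import math

def pitch_from_gravity(ax, az):
    """Pitch angle (radians) implied by the accelerometer's gravity vector."""
    return math.atan2(ax, az)

def fuse_pitch(visual_pitch, imu_pitch, alpha=0.9):
    """Blend the visual estimate with the drift-free gravity reference."""
    return alpha * visual_pitch + (1.0 - alpha) * imu_pitch

imu_pitch = pitch_from_gravity(0.0, 9.81)  # level machine -> 0 rad
fused = fuse_pitch(0.10, imu_pitch)
# 0.9 * 0.10 + 0.1 * 0.0 -> approximately 0.09 rad
```

The gravity reference anchors the vertical axis, so drift in the visual estimate of pitch or roll cannot accumulate unbounded between keyframes.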
[0031] Now turning to
[0032] The method may further include generating a model 570 by logging the tracking data in step 525, derived from tracking 535 a region of an image, fusing 555 the tracking data 550 from the first image 513 and a subsequent image 415, and mapping three-dimensional movement of the visual marker 335 as the work machine 100 moves. Vehicle speed 582 may also be used to generate a model 570 of grading performed.
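The role of vehicle speed in the model can be sketched as placing each logged depth sample at the distance travelled so far. The uniform sampling interval and function names are assumptions for illustration.

```python
# Sketch of building a simple grade profile from logged tracking data and
# vehicle speed: distance = speed * elapsed time at each sample.
# The fixed sampling period dt_s is an assumed logging scheme.

def build_grade_profile(depth_samples, speed_mps, dt_s):
    """depth_samples: cutting depths logged once every dt_s seconds.
    Returns (distance_m, depth_m) pairs along the pass."""
    return [(i * speed_mps * dt_s, depth)
            for i, depth in enumerate(depth_samples)]

profile = build_grade_profile([0.00, 0.02, 0.05], speed_mps=2.0, dt_s=0.5)
# -> [(0.0, 0.0), (1.0, 0.02), (2.0, 0.05)]
```

Profiles from successive passes, aligned via the shared keyframes described earlier, would accumulate into the overall model 570 of grading performed.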