TRACTOR-BASED TRAILER CLEARANCE AND POSITIONING SYSTEM AND METHOD
20230162509 · 2023-05-25
Inventors
- Stanley W DeLizo (Lynnwood, WA, US)
- Ian David O'Connor (Seattle, WA, US)
- August Avantaggio (Kent, WA, US)
- Kyle Christopher Wynn Johnson (Vancouver, WA, US)
CPC classification
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
G06V10/44
PHYSICS
G01B17/00
PHYSICS
International classification
G06V20/56
PHYSICS
B60R11/04
PERFORMING OPERATIONS; TRANSPORTING
G01B17/00
PHYSICS
Abstract
The present disclosure describes systems and methods for automated determination of certain physical characteristics of a trailer in a tractor-trailer truck and positional arrangement between the trailer and tractor of the truck. The technology may include a camera mounted on the tractor to acquire an image of at least a rear portion of the trailer; a sensor configured to acquire information relating to the trailer angle; a processor configured to determine a position of the rear portion of the trailer in the image, and determine the length of the trailer based at least in part on the determined position of the rear portion of the trailer in the image and the information relating to the trailer angle.
Claims
1. A system for determining a length of a trailer hitched to a tractor in a tractor-trailer truck, the system comprising: a camera positioned relative to the tractor and configured to acquire an image of at least a portion of the trailer, the imaged portion including at least a rear portion of the trailer; a sensor configured to acquire information relating to relative arrangement between the tractor and trailer at substantially the same time as the acquisition of the image; a processor configured to: receive from the camera the image; receive from the sensor the information relating to relative arrangement between the tractor and trailer; determine a position of the rear portion of the trailer in the image; and determine the length of the trailer based at least in part on the determined position of the rear portion of the trailer in the image and the information relating to the relative arrangement between the tractor and trailer.
2. The system of claim 1, wherein the information relating to relative arrangement between the tractor and trailer comprises an angle between a longitudinal axis of the tractor and a longitudinal axis of the trailer.
3. The system of claim 2, wherein the sensor comprises a distance sensor array configured to measure distances between the sensor array and a portion of the trailer, and wherein the processor is configured to calculate the angle between the longitudinal axis of the tractor and the longitudinal axis of the trailer based at least in part on the measured distances.
4. The system of claim 2, wherein the trailer has a wheelbase, and wherein the processor is further configured to determine the wheelbase of the trailer based at least in part on the angle between the longitudinal axis of the tractor and the longitudinal axis of the trailer.
5. The system of claim 4, wherein the processor is configured to determine the wheelbase of the trailer without using any information from any image acquired by the camera.
6. The system of claim 4, wherein the processor is further configured to set one or more parameters of the tractor's operation based at least in part on the angle between the longitudinal axis of the tractor and the longitudinal axis of the trailer and the wheelbase.
7. The system of claim 1, wherein the processor is further configured to process the image to detect edges within the image and use at least one of the detected edges to identify the rear portion of the trailer.
8. A truck computer adapted to be installed in a tractor, the truck computer comprising: at least one processor; and a memory operatively connected to the at least one processor, the memory storing instructions that, when executed by the at least one processor, and when the truck computer is installed in a tractor and a trailer is hitched to the tractor, cause the processor to carry out a process comprising: receiving image data; receiving a sensor signal; identifying from the image data a rear portion of the trailer; determining a position of the rear portion of the trailer in an image the image data represent; determining a relative arrangement between the tractor and trailer based at least in part on the sensor signal; and determining a length of the trailer based at least in part on the determined position of the rear portion of the trailer in the image and the relative arrangement between the tractor and trailer.
9. The truck computer of claim 8, wherein the relative arrangement between the tractor and trailer comprises an angle between a longitudinal axis of the tractor and a longitudinal axis of the trailer.
10. The truck computer of claim 9, wherein the process further comprises determining a wheelbase of the trailer based at least in part on the angle between the longitudinal axis of the tractor and the longitudinal axis of the trailer.
11. The truck computer of claim 10, wherein the process further includes acquiring a parameter of a state of operation of the tractor, wherein determining the wheelbase includes determining a wheelbase of the trailer further based on the parameter.
12. The truck computer of claim 11, wherein the parameter is speed of the truck.
13. The truck computer of claim 8, wherein determining a position of the rear portion of the trailer includes detecting edges within the image data and using at least one of the detected edges to identify the rear portion of the trailer.
14. The truck computer of claim 8, wherein the process further includes determining a parameter of an aspect of operation of the tractor.
15. The truck computer of claim 14, wherein determining a parameter of an aspect of operation of the tractor includes determining at least one limit of a turn radius of the truck.
16. A method for determining a length of a trailer hitched to a tractor in a tractor-trailer truck, the method including: acquiring, using a camera on the tractor, an image of at least a portion of the trailer, the imaged portion including at least a rear portion of the trailer; determining, using a processor, a position of the rear portion of the trailer in the image; acquiring, using a sensor on the tractor, information relating to relative arrangement between the tractor and trailer at substantially the same time as the acquisition of the image; and determining, using a processor, the length of the trailer based at least in part on the determined position of the rear portion of the trailer in the image and the information relating to the relative arrangement between the tractor and trailer.
17. The method of claim 16, wherein acquiring information relating to relative arrangement between the tractor and trailer comprises acquiring, using a sensor, an angle between a longitudinal axis of the tractor and a longitudinal axis of the trailer.
18. The method of claim 17, wherein acquiring the angle comprises using a distance sensor array to measure distances between the sensor array and a portion of the trailer, and calculating the angle between the longitudinal axis of the tractor and the longitudinal axis of the trailer based at least in part on the measured distances.
19. The method of claim 17, further comprising determining a wheelbase of the trailer based at least in part on the angle between the longitudinal axis of the tractor and the longitudinal axis of the trailer.
20. The method of claim 17, wherein determining a position of the rear portion of the trailer in the image includes processing the image to detect edges within the image and using at least one of the detected edges to identify the rear portion of the trailer.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Non-limiting and non-exhaustive examples are described with reference to the following figures.
[0021] While examples of the disclosure are amenable to various modifications and alternative forms, specific aspects have been shown by way of example in the drawings and are described in detail below. The intention is not to limit the scope of the disclosure to the particular aspects described.
DETAILED DESCRIPTION
[0022] The automotive industry is facing increasing demands for vehicles with self-driving and computer-assisted-driving capabilities. Designing and developing such vehicles present numerous technological challenges, some of which are unique to the commercial trucking industry. For example, for tractor-trailer (or “semi”) trucks, certain maneuvers, such as backing a trailer into a docking position at a loading dock, can be complex due to the changing angular relationship between the tractor and trailer. To automatically maneuver, or provide computer-assisted maneuvering of, a tractor-trailer truck into a desired position in a safe and efficient manner, it is important to ascertain certain characteristics of the trailer. Such characteristics can include, for example, the length of the trailer and the wheelbase of the trailer. The example trailer clearance and positioning systems and methods described below provide automated determination of certain characteristics of trailers attached to a tractor based on signals received from camera(s) or sensor(s) mounted on the tractor. Because a tractor is typically designed to have a variety of trailers attached to it, it is most efficient to mount a trailer clearance and positioning system in the tractor rather than in each trailer.
[0023] In some embodiments such as the example shown in
[0024] The trailer 120 in this example includes a trailer frame 122, which in this example is of a rectangular shape having a length l and a width w. However, the trailer frame 122 can be of other shapes suited for specific applications. For example, the trailer frame 122 can be of a circular cylindrical or elliptical cylindrical shape for transportation of a liquid payload. The trailer frame 122 also includes a rear wheel assembly 130, including one or more rear axles 132a, 132b and wheels 134 mounted on the rear axles 132a, 132b. The trailer frame 122 further includes a coupler (not shown) for engaging the trailer 120 with the kingpin 114 and permitting the trailer 120 to pivot about the kingpin 114 such that the trailer angle, i.e., the angle between the longitudinal axis 124 of the trailer 120 and the longitudinal axis 118 of the tractor 110, can vary as the tractor 110 pulls or pushes the trailer 120 in directions not parallel to the longitudinal axis 124 of the trailer 120. The trailer 120 is also characterized by a wheelbase WB, which is defined as the distance between the point of engagement with the kingpin 114 and the center of the rear wheel assembly 130. The trailer 120 is further characterized by a distance, d_p, between the front end of the trailer frame 122 and the point of engagement with the kingpin 114.
[0025] The tractor 110 in this example is equipped with a trailer clearance and positioning system 140, which includes one or more cameras 142 (as shown, including cameras 142a, 142b) and one or more sensors 146 (as shown, including sensors 146a, 146b, 146c, 146d, 146e, 146f). The cameras 142a, 142b in this example are mounted one on each side of the tractor 110, for example by attachment to, or being incorporated into, the side mirror assemblies 116a, 116b. The one or more cameras 142a, 142b in this example are aimed to the rear of the vehicle such that the rear end of the trailer 120 comes into the field-of-view of the cameras 142 at least for some range of the trailer angle. The one or more cameras 142 can be positioned at any location on the tractor so long as the part or parts of the trailer 120 to be used to determine the size (e.g., length) of the trailer 120 come into the field-of-view of the cameras at least for some range of the trailer angle. For example, the camera(s) can be mounted on the back of the tractor cab.
[0026] The one or more sensors 146 in this example are located at the rear end of the tractor cab and, as described in more detail below, are configured to determine the trailer angle, i.e., the angle between the longitudinal axis 118 of the tractor 110 and the longitudinal axis 124 of the trailer 120.
[0027] In this example, the camera 142a is positioned at a distance of C_x from the longitudinal axis 118 and a distance of C_y from a line perpendicular to the longitudinal axis 118 and passing through the kingpin 114. The values of C_x and C_y, as described in more detail below, are used in calculating certain characteristics, such as length, of the trailer 120.
[0028] In some embodiments, such as the one depicted in
[0029] The CPU 212 and the truck computer 214 are linked in this example by a data communication link 218, which in one example is a controller area network (CAN) link based on the Society of Automotive Engineers (SAE) J1939 protocol. The trailer angle calculated by the angle calculation unit 216 is transmitted to the CPU 212 in this example by a Universal Asynchronous Receiver-Transmitter (UART) device, which can be a part of the input/output (I/O) structure of the angle calculation unit 216. The signals from the sensors 146 in this example are analog signals 224, which are fed to the analog inputs of the angle calculation unit 216; in alternative embodiments, digital inputs may be provided to the angle calculation unit in addition to, or in place of, such analog inputs. Images captured by the cameras 142 are transmitted to the CPU 212 in this example via a serial link, such as a Universal Serial Bus (USB) link.
[0030] The CPU 212, as described in more detail below, carries out the process of calculating the trailer angle, trailer length, and wheelbase based at least in part on the images acquired by the camera 142 and/or the trailer angle calculated by the angle calculation unit 216 based on the signals acquired by the sensors 146 and/or vehicle information (e.g., tractor speed and turn angle (i.e., the position of the steering wheel)) supplied by the truck computer 214. The CPU 212 further supplies the calculated values to the truck computer 214 via the data communication link 218 to enable the truck computer 214 to calculate control parameters, such as minimum and maximum turn radii in four directions (forward-left, forward-right, backward-left, and backward-right), and to generate output signals based on the calculated control parameters to control the tractor to autonomously maneuver the truck or to guide a driver in maneuvering the truck.
[0031] The CPU 212 can be any processor, capable of carrying out image processing and other computational tasks for specific applications, with appropriate peripheral circuits. The truck computer 214 can be any processor suitable for requisite vehicle monitoring and control. Such computers are commercially available and typically installed in trucks as sold. The angle calculation unit 216 can be, or be included in, any suitable processor, including a microprocessor or microcontroller. In this example, the sensors 146, as described in more detail below, are ultrasonic distance sensors, and the angle calculation unit 216 is a microcontroller capable of calculating the trailer angle from the distance measurements by the sensors 146.
[0032] Although the processor 210 in this example includes three different processors (CPU 212, truck computer 214, and angle calculation unit 216), any suitable number of processors can be included. For example, the functionalities of the CPU 212 and truck computer 214 can be included in a single truck computer; alternatively, the functionalities of all three processors 212, 214, 216 can be included in a single truck computer.
[0033] With reference to
[0034] In this example, a portion on the left side (from the driver's perspective) 128 of the trailer 120 is imaged, and the left rear corner 310 of the trailer 120 is captured in the one or more images. The various quantities and variables denoted in
[0045] Based on purely geometric considerations, the following can be established:
L1 = C_y/cos(θ_f/2);
L2 = (C_x − a)·sin(90° − θ_t)/sin(θ_t + θ_f/2);
η = θ_t + θ_f/2 − θ_e;
T1 = (C_x − a)·sin(90° − θ_f/2)/sin(θ_t + θ_f/2);
T2 = (L1 + L2)·sin(θ_e)/cos(η);
and the total length, l, is given by:
l = T1 + T2 + b + d_p = f(θ_t, θ_e).
That is, the total length, l, is a function of the trailer angle, θ_t, and the angle, θ_e, between the edge 324 of the field-of-view and the line 328 connecting the camera 142a and the left rear corner 310 of the trailer 120. θ_t, as described in detail below, can be measured with the sensors 146; θ_e can be determined from the horizontal position of the left rear corner 310 of the trailer 120 in the images captured by the camera 142a. Because the total number of pixels in the horizontal direction in the images corresponds to the angle, θ_f, spanning the field-of-view, the number of pixels from the left edge of the images to the left rear corner 310 corresponds to θ_e; θ_e can therefore be calculated by the processor 210 (more specifically, the CPU 212 in the example shown in
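The relations above can be collected into a short computation. The following Python sketch is illustrative only: the function name is hypothetical, the offsets a, b, and d_p are the camera and frame offsets labelled in the figure (their values must come from the actual installation), and θ_e is derived from the rear corner's horizontal pixel position under the linear pixel-to-angle mapping described above.

```python
import math

def trailer_length(theta_t_deg, corner_px, img_width_px,
                   theta_f_deg, C_x, C_y, a, b, d_p):
    """Trailer length l from the geometric relations in [0045].

    theta_t_deg  : trailer angle from the sensor array (degrees)
    corner_px    : horizontal pixel position of the rear corner in the image
    img_width_px : total horizontal pixels (spanning the field of view theta_f)
    C_x, C_y, a, b, d_p : camera-placement and frame offsets from the figure
                          (hypothetical placeholders here)
    """
    theta_f = math.radians(theta_f_deg)
    theta_t = math.radians(theta_t_deg)
    # theta_e scales linearly with pixel position across the field of view
    theta_e = (corner_px / img_width_px) * theta_f

    L1 = C_y / math.cos(theta_f / 2)
    L2 = (C_x - a) * math.sin(math.pi / 2 - theta_t) / math.sin(theta_t + theta_f / 2)
    eta = theta_t + theta_f / 2 - theta_e
    T1 = (C_x - a) * math.sin(math.pi / 2 - theta_f / 2) / math.sin(theta_t + theta_f / 2)
    T2 = (L1 + L2) * math.sin(theta_e) / math.cos(eta)
    return T1 + T2 + b + d_p  # total length l = T1 + T2 + b + d_p
```

With plausible placeholder values (distances in meters, a 90° field of view), the function returns a positive length on the order of the camera-to-corner geometry; the numbers themselves carry no significance beyond exercising the formula.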
[0046] In some embodiments, the trailer angle can be measured using the sensors 146, as shown in the example in
[0047] It is noted that although six sensors are used in the examples given above, other numbers of sensors can be used. For example, a single distance sensor may be used to determine the trailer angle by first measuring an initial distance for a known trailer angle (e.g., θ_t=0°) and subsequently measuring distances and determining the corresponding trailer angles based on the measured distances and the initial distance. Moreover, other types of distance sensors, and other types of sensors in general, can be used to determine the trailer angle. For example, infrared depth cameras mounted on a tractor may be used to determine distances between the cameras and the front edge of a trailer in order to determine the trailer angle. As another example, digital cameras can be used to capture images of the back edge of the tractor cab and the front edge of the trailer from above, and the images can be processed by a processor to determine the trailer angle. For example, the so-called Canny edge detection algorithm can be used by the processor to locate the back edge of the tractor cab and the front edge of the trailer, and the angle between the edges can then be calculated by the processor.
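For the sensor-array approach, one common way to recover the trailer angle is to fit a straight line to the distances measured across the array and take the arctangent of its slope. The sketch below assumes, hypothetically, that the sensors sit in a lateral row on the back of the cab at known offsets and that each measures range to the trailer's front face; the function name and geometry are illustrative, not the patent's specific arrangement.

```python
import math

def trailer_angle_from_distances(sensor_x, distances):
    """Least-squares line fit through (lateral sensor position, measured
    distance) pairs; the slope of the trailer's front face relative to the
    sensor row gives the trailer angle in degrees.

    sensor_x  : lateral offsets of the sensors along the back of the cab
    distances : corresponding ranges to the trailer front face
    """
    n = len(sensor_x)
    mean_x = sum(sensor_x) / n
    mean_d = sum(distances) / n
    # Ordinary least-squares slope of distance vs. lateral position
    num = sum((x - mean_x) * (d - mean_d) for x, d in zip(sensor_x, distances))
    den = sum((x - mean_x) ** 2 for x in sensor_x)
    slope = num / den
    return math.degrees(math.atan(slope))
```

A single-sensor variant, as the paragraph notes, would instead compare each new range against the range recorded at a known trailer angle.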
[0048]
[0049]
[0050] Although the top edge 610 is used to locate the rear portion of the trailer 120 in the example shown in
[0051] In some embodiments, the wheelbase (WB) of a trailer can be determined using the measured trailer angles and/or images captured of the trailer. In one example, illustrated in
X = ∫_t0^t1 V dt;
WB = X/tan(θ_t(t0)).
[0052] It is noted that the tractor speed, V, and information regarding whether the tractor is driving straight (i.e., the turn angle) and other information on the tractor's operating condition can be supplied from the truck computer 214.
[0053] It is further noted that the end time of the integration used in the above example is the time when the trailer angle becomes zero, but the end time can be any time. A non-zero trailer angle at the end time would introduce some additional trigonometric complexity, but the calculation would remain straightforward. Furthermore, the tractor 110 need not be driven in a straight line; because the movement of the center point 136 is subject to the constraint that it remains at a distance of WB from the kingpin 114, WB can be computed from the trailer angle as a function of the tractor motion.
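The wheelbase computation in [0051]-[0053] can be sketched as a discrete integration of the tractor speed while the tractor pulls straight and the trailer angle decays toward zero. The function name, the (dt, speed, angle) sample format, and the 0.5° "straightened" tolerance below are illustrative assumptions standing in for the feed of operating data from the truck computer 214.

```python
import math

def wheelbase_from_straight_pull(samples):
    """Estimate the trailer wheelbase WB per [0051]: integrate tractor
    speed while pulling straight until the trailer angle settles to zero,
    then WB = X / tan(theta_t at the start).

    samples : list of (dt_seconds, speed_m_per_s, trailer_angle_deg)
              tuples, a hypothetical stand-in for the vehicle data feed
    """
    theta0 = math.radians(samples[0][2])  # initial trailer angle theta_t(t0)
    X = 0.0
    for dt, v, angle_deg in samples:
        X += v * dt                       # distance travelled: X = integral of V dt
        if abs(angle_deg) < 0.5:          # trailer has straightened (assumed tolerance)
            break
    return X / math.tan(theta0)
```

As [0053] notes, stopping before the angle reaches zero is also workable; it merely changes the trigonometry applied at the end time.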
[0054] In some embodiments, as outlined in
[0055] An example of the method outlined above includes the following:
[0056] When the truck begins making a left turn, a rear-facing, side-mirror-mounted camera is activated to acquire images;
[0057] At substantially the same time as the images are acquired, distance sensors are activated to determine the trailer angle;
[0058] Each of the images is then processed as follows:
[0059] The acquired image is converted to grayscale;
[0060] A Gaussian blur filter is applied to remove noise from the image;
[0061] A Canny edge detection algorithm is run to locate edges in each image (example:
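A minimal, pure-Python sketch of this per-image pipeline (grayscale conversion, Gaussian blur, edge map) might look as follows. The function name, the edge-clamped 5-tap blur kernel, and the single gradient-magnitude threshold standing in for Canny's non-maximum suppression and hysteresis steps are all illustrative assumptions, not the patent's implementation; a production system would use a full Canny detector.

```python
import math

def detect_edges(rgb):
    """Simplified sketch of the steps in [0059]-[0061]: grayscale ->
    Gaussian blur -> gradient-magnitude edge map.

    rgb : list of rows of (r, g, b) tuples, channel values in [0, 1]
    Returns a boolean edge map of the same height and width.
    """
    h, w = len(rgb), len(rgb[0])
    # Luminance grayscale
    gray = [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row] for row in rgb]

    # Separable 5-tap Gaussian blur (sigma of about 1), edge-clamped
    k = [1 / 16, 4 / 16, 6 / 16, 4 / 16, 1 / 16]
    def blur_1d(v):
        n = len(v)
        return [sum(k[j] * v[min(max(i + j - 2, 0), n - 1)] for j in range(5))
                for i in range(n)]
    blurred = [blur_1d(row) for row in gray]                      # horizontal pass
    cols = [blur_1d([blurred[y][x] for y in range(h)]) for x in range(w)]
    blurred = [[cols[x][y] for x in range(w)] for y in range(h)]  # vertical pass

    # Central-difference gradients (clamped at borders) and global threshold
    mag = [[math.hypot(
        blurred[y][min(x + 1, w - 1)] - blurred[y][max(x - 1, 0)],
        blurred[min(y + 1, h - 1)][x] - blurred[max(y - 1, 0)][x])
        for x in range(w)] for y in range(h)]
    peak = max(max(row) for row in mag)
    return [[m > 0.5 * peak for m in row] for row in mag]
```

On a synthetic image with a vertical black-to-white step, the resulting edge map lights up along the step and stays clear elsewhere, which is the behavior the subsequent corner-locating step relies on.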
[0066] As alluded to above, in some embodiments, certain processes described above are carried out by a computer system, such as an onboard computer system. Such a computer system in some embodiments includes one or more special-purpose computers, which can be one or more general-purpose computers specifically programmed to perform the methods. For example, a computer 900 schematically shown in
[0067] The embodiments described herein may be employed using software, hardware, or a combination of software and hardware to implement and perform the systems and methods disclosed herein. Although specific devices have been recited throughout the disclosure as performing specific functions, one of skill in the art will appreciate that these devices are provided for illustrative purposes, and other devices may be employed to perform the functionality disclosed herein without departing from the scope of the disclosure. In addition, some aspects of the present disclosure are described above with reference to block diagrams and/or operational illustrations of systems and methods according to aspects of this disclosure. The functions, operations, and/or acts noted in the blocks may occur out of the order that is shown in any respective flowchart. For example, two blocks shown in succession may in fact be executed or performed substantially concurrently or in reverse order, depending on the functionality and implementation involved.
[0068] This disclosure describes some embodiments of the present technology with reference to the accompanying drawings, in which only some of the possible embodiments are shown. Other aspects may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure is thorough and complete and fully conveys the scope of the possible embodiments to those skilled in the art.
[0069] Further, as used herein and in the claims, the phrase “at least one of element A, element B, or element C” is intended to convey any of: element A, element B, element C, elements A and B, elements A and C, elements B and C, and elements A, B, and C. In addition, one having skill in the art will understand the degree of meaning that terms such as “about” or “substantially” convey in light of the measurement techniques utilized herein. To the extent such terms may not be clearly defined or understood by one having skill in the art, the term “about” shall mean plus or minus ten percent.
[0070] Although specific embodiments are described herein, the scope of the technology is not limited to those specific embodiments. One skilled in the art will recognize other embodiments or improvements that are within the scope and spirit of the present technology. In addition, one having skill in the art will recognize that the various examples and embodiments described herein may be combined with one another. Therefore, the specific structure, acts, or media are disclosed only as illustrative embodiments. The scope of the technology is defined by the following claims and any equivalents therein.