Sensor integration for large autonomous vehicles
12248323 · 2025-03-11
Assignee
Inventors
CPC classification
B60R1/12
PERFORMING OPERATIONS; TRANSPORTING
G05D1/247
PHYSICS
B60R2001/1223
PERFORMING OPERATIONS; TRANSPORTING
G01S17/87
PHYSICS
G01S17/86
PHYSICS
G05D2111/50
PHYSICS
G05D1/249
PHYSICS
International classification
G01S17/86
PHYSICS
B60R1/12
PERFORMING OPERATIONS; TRANSPORTING
G01S13/86
PHYSICS
G01S17/87
PHYSICS
G01S7/481
PHYSICS
G05D1/00
PHYSICS
G05D1/247
PHYSICS
Abstract
The technology relates to autonomous vehicles for transporting cargo and/or people between locations. Distributed sensor arrangements may not be suitable for vehicles such as large trucks, buses or construction vehicles. Side view mirror assemblies are provided that include a sensor suite of different types of sensors, including LIDAR, radar, cameras, etc. Each side assembly is rigidly secured to the vehicle by a mounting element. The sensors within the assembly may be aligned or arranged relative to a common axis or physical point of the housing. This enables self-referenced calibration of all sensors in the housing. Vehicle-level calibration can also be performed between the sensors on the left and right sides of the vehicle. Each side view mirror assembly may include a conduit that provides one or more of power, data and cooling to the sensors in the housing.
Claims
1. A vehicle configured to operate in an autonomous driving mode, the vehicle comprising: a pair of side sensor assemblies attached to opposite sides of the vehicle, each side sensor assembly including: an exterior housing; a mounting element configured to rigidly secure the exterior housing to a corresponding side of the vehicle; a short range light detection and ranging (lidar) sensor aligned along a common axis of the exterior housing, the short range lidar sensor configured to provide a first field of view (i) having a first detection range along the corresponding side of the vehicle and (ii) encompassing a selected area adjacent to the vehicle; a long range lidar sensor aligned along the common axis, the long range lidar sensor configured to provide a second field of view (i) having a second detection range along the corresponding side of the vehicle of at least 50 meters and (ii) encompassing the selected area, the short range lidar sensor and the long range lidar sensor further configured to provide redundancy of coverage for the selected area; and a control system having one or more computer processors configured to receive data from the pair of side sensor assemblies and to direct operation of the vehicle, when operating in the autonomous driving mode, based on the received data.
2. The vehicle of claim 1, wherein the selected area is adjacent to the corresponding side of the vehicle.
3. The vehicle of claim 1, wherein the selected area is adjacent to a front of the vehicle.
4. The vehicle of claim 1, wherein the first detection range is no more than 20 meters.
5. The vehicle of claim 1, wherein each side sensor further comprises: a first camera disposed adjacent to the long range lidar sensor and configured to obtain imagery towards a front of the vehicle; and a second camera disposed adjacent to the long range lidar sensor and configured to obtain imagery along the corresponding side of the vehicle.
6. The vehicle of claim 5, wherein each side sensor further comprises a third camera configured to obtain imagery towards a rear of the vehicle.
7. The vehicle of claim 1, wherein each side sensor further comprises a radar sensor.
8. The vehicle of claim 7, wherein the radar sensor is disposed within the exterior housing between the long range lidar sensor and short range lidar sensor.
9. The vehicle of claim 7, wherein the radar sensor comprises: a first radar sensor configured to obtain radar returns along a front of the vehicle; a second radar sensor configured to obtain radar returns along the corresponding side of the vehicle; and a third radar sensor configured to obtain radar returns along a rear of the vehicle.
10. The vehicle of claim 1, wherein: the vehicle is a tractor-trailer vehicle including a cab and at least one trailer; the exterior housing of a first side sensor assembly of the pair of side sensor assemblies is rigidly secured to a left side of the cab; and the exterior housing of a second side sensor assembly of the pair of side sensor assemblies is rigidly secured to a right side of the cab.
11. The vehicle of claim 1, wherein the long range lidar sensor disposed in a first one of the pair of side sensor assemblies is configured to provide redundancy for the long range lidar sensor disposed in a second one of the pair of side sensor assemblies along a region external to the vehicle within a selected field of view.
12. The vehicle of claim 1, further comprising a first conduit at least partly received within the exterior housing, the first conduit providing one or more of a power line or a data line to the long range lidar sensor and the short range lidar sensor and configured for connection to one or more of operational systems of the vehicle.
13. The vehicle of claim 12, further comprising a second conduit at least partly received within the exterior housing, the second conduit providing at least one of a cooling line configured to provide cooling to at least one of the long range lidar sensor or the short range lidar sensor, or a heating line configured to provide heating to at least one of the long range lidar sensor or the short range lidar sensor, the second conduit configured for connection to one or more of the operational systems of the vehicle.
14. A side sensor assembly for use on a truck or bus capable of operating in an autonomous driving mode, the side sensor assembly comprising: an exterior housing; a mounting element configured to rigidly secure the exterior housing to a corresponding side of the truck or bus; a long range light detection and ranging (lidar) sensor aligned along a common axis of the exterior housing, the long range lidar sensor configured to provide a first detection range of at least 50 meters; and a short range lidar sensor aligned along the common axis of the exterior housing, the short range lidar sensor configured to provide a second detection range, wherein the long range lidar sensor and the short range lidar sensor are configured to provide complementary information corresponding to a selected area adjacent to the truck or bus.
15. The side sensor assembly of claim 14, wherein each side sensor further comprises: a first camera disposed adjacent to the long range lidar sensor and configured to obtain imagery along a front of the truck or bus; and a second camera disposed adjacent to the long range lidar sensor and configured to obtain imagery along a side of the truck or bus.
16. The side sensor assembly of claim 15, wherein each side sensor further comprises a radar sensor disposed within the exterior housing between the long range lidar sensor and short range lidar sensor.
17. The side sensor assembly of claim 15, wherein the long range lidar sensor is configured to provide redundancy for another long range lidar sensor disposed in another side sensor assembly of the truck or bus, along a region external to the truck or bus within a selected field of view.
18. The side sensor assembly of claim 15, further comprising a conduit at least partly received within the exterior housing, the conduit providing one or more of a power line or a data line to the long range lidar sensor and the short range lidar sensor and configured for connection to one or more of operational systems of the truck or bus.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
Overview
(8) The technology relates to autonomous or semi-autonomous vehicles for transporting cargo and/or people between locations. Large trucks, buses and construction equipment, unlike passenger cars, typically do not provide good 360° visibility from a single vantage point.
(9) Such large vehicles may have multiple blind spot areas on the sides and to the rear. Placing sensors on top of the truck cab or trailer, or on the roof of the bus, may not resolve the blind spot issue, and may or may not be feasible. For example, given the heights of such vehicles, it may be impractical to locate sensors on the roof or top due to low clearance bridges, underpasses, tunnels, parking structures, etc. This may limit routes available to the vehicle. It may also be difficult to maintain or service sensors placed on top of large vehicles.
(10) One way to address certain blind spot issues is via side view mirror assemblies. The side view mirror assemblies on large trucks and buses can be placed towards the front of the vehicle. These assemblies can be secured by one or more bracket elements, and project away from the vehicle to the side and/or front, for instance as shown in the top views of the drawings.
(11) There are different degrees of autonomy that may occur in a partially or fully autonomous driving system. The U.S. National Highway Traffic Safety Administration and the Society of Automotive Engineers have identified different levels to indicate how much, or how little, the vehicle controls the driving. For instance, Level 0 has no automation and the driver makes all driving-related decisions. The lowest semi-autonomous mode, Level 1, includes some drive assistance such as cruise control. Level 2 has partial automation of certain driving operations, while Level 3 involves conditional automation that can enable a person in the driver's seat to take control as warranted. In contrast, Level 4 is a high automation level where the vehicle is able to drive without assistance in select conditions. And Level 5 is a fully autonomous mode in which the vehicle is able to drive without assistance in all situations. The architectures, components, systems and methods described herein can function in any of the semi- or fully-autonomous modes, e.g., Levels 1-5, which are referred to herein as autonomous driving modes. Thus, reference to an autonomous driving mode includes both partial and full autonomy.
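The level taxonomy above maps naturally onto a small enumeration. The following Python sketch is illustrative only; the names are invented for the example, and the Level 1-5 cutoff for "autonomous driving mode" follows the paragraph above.

```python
from enum import IntEnum

class DrivingAutomationLevel(IntEnum):
    """SAE-style driving automation levels as summarized in the text."""
    NO_AUTOMATION = 0           # driver makes all driving-related decisions
    DRIVER_ASSISTANCE = 1       # some drive assistance, e.g. cruise control
    PARTIAL_AUTOMATION = 2      # partial automation of certain operations
    CONDITIONAL_AUTOMATION = 3  # person in driver's seat takes control as warranted
    HIGH_AUTOMATION = 4         # drives without assistance in select conditions
    FULL_AUTOMATION = 5         # drives without assistance in all situations

def is_autonomous_driving_mode(level: DrivingAutomationLevel) -> bool:
    """Per the text, 'autonomous driving mode' covers Levels 1-5."""
    return level >= DrivingAutomationLevel.DRIVER_ASSISTANCE
```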
Example Systems
(13) The memory 206 stores information accessible by the one or more processors 204, including instructions 208 and data 210 that may be executed or otherwise used by the processors 204. The memory 206 may be of any type capable of storing information accessible by the processor, including a computing device-readable medium. The memory is a non-transitory medium such as a hard drive, memory card, optical disk, solid-state memory, tape memory, or the like. Systems may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
(14) The instructions 208 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms instructions and programs may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. The data 210 may be retrieved, stored or modified by one or more processors 204 in accordance with the instructions 208. As an example, data 210 of memory 206 may store information, such as calibration information, to be used when calibrating different types of sensors.
(15) The one or more processors 204 may be any conventional processors, such as commercially available CPUs. Alternatively, the one or more processors may be a dedicated device such as an ASIC or other hardware-based processor.
(16) In one example, the computing devices 202 may form an autonomous driving computing system incorporated into vehicle 100 or 120. The autonomous driving computing system may be capable of communicating with various components of the vehicle.
(17) The computing devices 202 may control the direction and speed of the vehicle by controlling various components. By way of example, computing devices 202 may navigate the vehicle to a destination location completely autonomously using data from the map information and navigation system 220. Computing devices 202 may use the positioning system 222 to determine the vehicle's location and the perception system 224 to detect and respond to objects when needed to reach the location safely. In order to do so, computing devices 202 may cause the vehicle to accelerate (e.g., by increasing fuel or other energy provided to the engine by acceleration system 214), decelerate (e.g., by decreasing the fuel supplied to the engine, changing gears, and/or by applying brakes by deceleration system 212), change direction (e.g., by turning the front or other wheels of vehicle 100 or 120 by steering system 216), and signal such changes (e.g., by lighting turn signals of signaling system 218). Thus, the acceleration system 214 and deceleration system 212 may be a part of a drivetrain or other transmission system 230 that includes various components between an engine of the vehicle and the wheels of the vehicle. Again, by controlling these systems, computing devices 202 may also control the transmission system 230 of the vehicle in order to maneuver the vehicle autonomously.
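The routing of driving decisions to the deceleration, acceleration, steering and signaling systems described above can be sketched as a simple dispatch. This is a hypothetical illustration: `ControlCommand` and `route_command` are invented for the example, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ControlCommand:
    """Hypothetical command bundle; field names are illustrative only."""
    acceleration: float    # >0 routes to acceleration, <0 to deceleration
    steering_angle: float  # nonzero routes to the steering system
    signal: Optional[str]  # e.g. "left" or "right", routes to signaling

def route_command(cmd: ControlCommand) -> List[str]:
    """Map one command to the vehicle subsystems named in the text."""
    systems = []
    if cmd.acceleration > 0:
        systems.append("acceleration system 214")
    elif cmd.acceleration < 0:
        systems.append("deceleration system 212")
    if cmd.steering_angle != 0.0:
        systems.append("steering system 216")
    if cmd.signal is not None:
        systems.append("signaling system 218")
    return systems
```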
(18) As an example, computing devices 202 may interact with deceleration system 212 and acceleration system 214 in order to control the speed of the vehicle. Similarly, steering system 216 may be used by computing devices 202 in order to control the direction of vehicle. For example, if the vehicle is configured for use on a road, such as a tractor-trailer or a bus, the steering system 216 may include components to control the angle of wheels to turn the vehicle. Signaling system 218 may be used by computing devices 202 in order to signal the vehicle's intent to other drivers or vehicles, for example, by lighting turn signals or brake lights when needed.
(19) Navigation system 220 may be used by computing devices 202 in order to determine and follow a route to a location. In this regard, the navigation system 220 and/or data 210 may store map information, e.g., highly detailed maps that computing devices 202 can use to navigate or control the vehicle. As an example, these maps may identify the shape and elevation of roadways, lane markers, intersections, crosswalks, speed limits, traffic signal lights, buildings, signs, real time traffic information, vegetation, or other such objects and information. The lane markers may include features such as solid or broken double or single lane lines, solid or broken lane lines, reflectors, etc. A given lane may be associated with left and right lane lines or other lane markers that define the boundary of the lane. Thus, most lanes may be bounded by a left edge of one lane line and a right edge of another lane line.
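A map lane bounded by left and right lane lines, as described above, can be represented as a small record. The field names below are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Lane:
    """Hypothetical map-lane record with elements the text lists."""
    left_line: List[Tuple[float, float]]   # boundary polyline of (x, y) points, meters
    right_line: List[Tuple[float, float]]  # boundary polyline of (x, y) points, meters
    speed_limit_mps: float                 # speed limit, meters per second
    line_style: str                        # e.g. "solid", "broken", "double"

def lane_width_at(lane: Lane, i: int) -> float:
    """Approximate lane width from the i-th pair of boundary points."""
    (lx, ly), (rx, ry) = lane.left_line[i], lane.right_line[i]
    return ((lx - rx) ** 2 + (ly - ry) ** 2) ** 0.5
```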
(20) The perception system 224 also includes one or more components for detecting objects external to the vehicle such as other vehicles, obstacles in the roadway, traffic signals, signs, trees, etc. For example, the perception system 224 may include one or more light detection and ranging (LIDAR) sensors, sonar devices, radar units, cameras, inertial (e.g., gyroscopic) sensors, and/or any other detection devices that record data which may be processed by computing devices 202. The sensors of the perception system may detect objects and their characteristics such as location, orientation, size, shape, type (for instance, vehicle, pedestrian, bicyclist, etc.), heading, and speed of movement, etc. The raw data from the sensors and/or the aforementioned characteristics can be sent for further processing to the computing devices 202 periodically and continuously as it is generated by the perception system 224. Computing devices 202 may use the positioning system 222 to determine the vehicle's location and perception system 224 to detect and respond to objects when needed to reach the location safely. In addition, the computing devices 202 may perform calibration of individual sensors, all sensors in a particular sensor assembly, or between sensors in different sensor assemblies.
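A perceived object with the characteristics listed above can be modeled as a small data structure. The sketch below, including the blind-spot-style side filter, is illustrative; the names and the 3-meter band are assumptions, not values from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DetectedObject:
    """One perceived object with characteristics the text lists."""
    object_type: str               # e.g. "vehicle", "pedestrian", "bicyclist"
    location: Tuple[float, float]  # (x, y) in the vehicle frame, meters
    heading: float                 # degrees
    speed: float                   # meters per second
    size: Tuple[float, float]      # (length, width), meters

def objects_near_side(objects: List[DetectedObject],
                      max_lateral_m: float = 3.0) -> List[DetectedObject]:
    """Keep objects within a lateral band along the vehicle's side,
    e.g. for the blind spot monitoring motivated earlier in the text."""
    return [o for o in objects if abs(o.location[1]) <= max_lateral_m]
```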
(24) A set of cameras 404 may be distributed along the housing 302, for instance to provide forward, side and rear-facing imagery. Similarly, a set of radars 406 may be distributed along the housing 302 to provide forward, side and rear-facing data. And the sensors 408 may include an inertial sensor, a gyroscope, an accelerometer and/or other sensors. Each of the sensors may be aligned or arranged relative to a common axis 409 or physical point within the housing 302. Examples of these sensors are also illustrated in the drawings.
Example Implementations
(25) In addition to the structures and configurations described above and illustrated in the figures, various implementations will now be described.
(26) As noted above, for large trucks, buses, construction equipment and other vehicles, it may be impractical to place sensors on the roof of the vehicle. The roof can be hard to access and has side view limitations. In addition, mounting various sensors on the roof may interfere with aerodynamic roof fairings. While different sensors could be distributed along the front, sides and rear of the vehicle, this may be costly and require running individual data, power and/or cooling lines to each individual sensor. Furthermore, such a solution could be very hard to implement with legacy vehicles, or when the cab of a truck is capable of operating in an autonomous mode but the trailer is a legacy trailer without the necessary sensors.
(27) Thus, according to one aspect, the sensor housing is integrated into a side view mirror assembly, such as shown in the drawings.
(28) Assembling the system would include running the conduit from the sensor housing to the truck cab or vehicle chassis. Aggregating the cooling, power and data lines in the conduit, or in separate sub-conduits, and running them to one location on the side of the vehicle significantly simplifies the design, lowers the cost of the components and reduces the time and expense of putting the sensors on the vehicle.
(29) Furthermore, the typical height of the side view mirror for a semi-truck or a bus is on the order of 2 meters, for instance between 1.5 and 2.5 meters from the ground. This may be an ideal height for the LIDARs, radars, cameras and other sensors of an integrated sensor tower. And because truck and bus side view mirrors are designed to provide clear lines of sight down the side of the vehicle, the sensors within the housing will enjoy the same visibility. In addition, placing the sensors in the side view mirror assembly protects them from road debris and wheel splash, as the sensors will be approximately 1.5-2.5 meters from the ground and away from the wheel wells.
(30) Integrating the sensor housing as part of the side view mirror has the added benefit of avoiding occlusion by a conventional side view mirror. And by conforming to the form factors and placements of side view mirrors, the sensor housing will conform to requirements set forth by the U.S. National Highway Traffic Safety Administration and other governing bodies regarding placement of such elements external to the vehicle. And from a branding standpoint, a common appearance can be provided with a sensor assembly used by various types of large vehicles.
(31) While arranging multiple types of sensors in a side view mirror housing for a large truck or bus may be different than a solution employed for a smaller passenger vehicle, the sensors and algorithms for those sensors that are designed to work with passenger cars can be employed in this new arrangement as well. For instance, the height of the sensors, at around 1.5-2.5 meters, is approximately the height of sensors located on the roof of a sedan or sport utility vehicle.
(32) One advantage of co-locating the sensors in the side view mirror housing is that this location provides visibility over the hood of the vehicle and more than a 180° field of view (FOV) for sensors such as LIDARs, radars and cameras. An example of this is shown in the drawings.
(33) The long range LIDARs may be located along a top or upper area of the sensor housings 502. For instance, this portion of the housing 502 may be located closest to the top of the truck cab or roof of the vehicle. This placement allows the long range LIDAR to see over the hood of the vehicle. And the short range LIDARs may be located along a bottom area of the sensor housing 502 opposite the long range LIDARs. This allows the short range LIDARs to cover areas immediately adjacent to the cab of the truck or the front portion of a bus. This would allow the perception system to determine whether an object such as another vehicle, pedestrian, bicyclist, etc. is next to the front of the vehicle and take that information into account when determining how to drive or turn. Both types of LIDARs may be co-located in the housing, aligned along a common axis.
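The redundant coverage described above (and recited in claim 1) can be checked geometrically: a point in the selected area should fall within both the short range and long range lidar fields of view. The following is a minimal 2D sketch; the FOV angles and the 20 m / 80 m ranges are illustrative assumptions beyond the at-least-50-meter long range figure given in the claims.

```python
import math

def in_fov(sensor: dict, point: tuple) -> bool:
    """True if a 2D point (x, y) in the sensor frame lies inside the
    sensor's horizontal field of view (centered on +x) and detection range.
    sensor: dict with 'range_m' and 'fov_deg' keys (illustrative model)."""
    x, y = point
    if math.hypot(x, y) > sensor["range_m"]:
        return False
    bearing = math.degrees(math.atan2(y, x))
    return abs(bearing) <= sensor["fov_deg"] / 2

def redundantly_covered(point: tuple, short_lidar: dict, long_lidar: dict) -> bool:
    """Claim 1's redundancy condition: the selected area is seen by both
    the short range and the long range lidar."""
    return in_fov(short_lidar, point) and in_fov(long_lidar, point)
```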
(36) In addition to the cost benefits and reduction in installation time, another benefit to co-locating the LIDAR, radar, camera and/or other sensors in a side view mirror housing involves calibration. Placing these sensors in the same housing means that they are all subject to the same relative movement, as they may be affixed within the housing relative to a common axis or reference point of the housing. This reduces the complexity involved in calibrating each sensor individually and with respect to the other co-located sensors. Calibration of all sensors in one of the side view mirror housings can be done for the whole assembly so that everything is referenced to itself. This is easily accomplished because all sensors in the housing can be rigidly mounted with respect to each other.
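Because every sensor is rigidly mounted relative to a common axis or reference point of the housing, a detection from one sensor can be re-expressed in any other sensor's frame by passing through the common housing frame. The 2D sketch below illustrates that idea; the pose representation and function names are assumptions for illustration, not the patent's method.

```python
import math

def make_pose(x: float, y: float, theta_deg: float) -> dict:
    """Rigid 2D pose of a sensor relative to the housing's common reference."""
    t = math.radians(theta_deg)
    return {"x": x, "y": y, "c": math.cos(t), "s": math.sin(t)}

def sensor_to_housing(pose: dict, px: float, py: float) -> tuple:
    """Map a point from the sensor frame into the common housing frame."""
    return (pose["x"] + pose["c"] * px - pose["s"] * py,
            pose["y"] + pose["s"] * px + pose["c"] * py)

def housing_to_sensor(pose: dict, hx: float, hy: float) -> tuple:
    """Inverse mapping: common housing frame into the sensor frame."""
    dx, dy = hx - pose["x"], hy - pose["y"]
    return (pose["c"] * dx + pose["s"] * dy,
            -pose["s"] * dx + pose["c"] * dy)
```

With both lidar poses fixed relative to the same housing reference, a return from the long range lidar can be composed through the housing frame into the short range lidar's frame without any pairwise calibration step.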
(37) Furthermore, vehicle level calibration between left and right side sensor housings can be accomplished by matching features (e.g., via convolution) in front of the vehicle, or other overlapping data points. Knowing where the features are with respect to the vehicle also gives the system extrinsic calibrations. And for sensor subsystems, such as an inertial sensor subsystem that may employ redundant sensor packages, the different sensor packages may be mounted in each of the side view mirror housings. This has the added benefit of providing high resolution orientation information for all of the co-located sensors.
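The left-right feature matching described above amounts to estimating a rigid transform between the two assemblies from overlapping observations. Below is a minimal 2D sketch using a closed-form Procrustes fit; it assumes point correspondences are already established, and the patent does not specify this particular algorithm.

```python
import math

def estimate_rigid_2d(left_pts, right_pts):
    """Estimate the 2D rigid transform (theta, tx, ty) that maps features
    observed in the left assembly's frame onto the same features as seen
    by the right assembly, via a least-squares Procrustes fit."""
    n = len(left_pts)
    # Centroids of both point sets.
    lx = sum(p[0] for p in left_pts) / n
    ly = sum(p[1] for p in left_pts) / n
    rx = sum(p[0] for p in right_pts) / n
    ry = sum(p[1] for p in right_pts) / n
    # Cross-covariance terms of the centered points.
    sxx = sxy = 0.0
    for (ax, ay), (bx, by) in zip(left_pts, right_pts):
        ax, ay = ax - lx, ay - ly
        bx, by = bx - rx, by - ry
        sxx += ax * bx + ay * by   # dot products
        sxy += ax * by - ay * bx   # cross products
    theta = math.atan2(sxy, sxx)
    # Translation follows from the centroids and the fitted rotation.
    c, s = math.cos(theta), math.sin(theta)
    tx = rx - (c * lx - s * ly)
    ty = ry - (s * lx + c * ly)
    return theta, tx, ty
```

Recovering this transform from shared features is what yields the extrinsic calibration between the two housings at the vehicle level.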
(38) Unless otherwise stated, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of the examples described herein, as well as clauses phrased as such as, including and the like, should not be interpreted as limiting the subject matter of the claims to the specific examples; rather, the examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings can identify the same or similar elements.