Vehicle collision detection and driver notification system
20230093042 · 2023-03-23
Inventors
- Eduard Alarcon Cot (Barcelona, ES)
- Alvaro Ferrer Rizo (Madrid, ES)
- Eugeni Llagostera Saltor (Barcelona, ES)
- Cristina Castillo Cerda (Cerdanyola del Valles, ES)
CPC classification
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
G08G1/165
PHYSICS
G06T7/277
PHYSICS
B60W30/0953
PERFORMING OPERATIONS; TRANSPORTING
B62J50/21
PERFORMING OPERATIONS; TRANSPORTING
G06V20/58
PHYSICS
G08G1/166
PHYSICS
B62J45/41
PERFORMING OPERATIONS; TRANSPORTING
B60W2300/36
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W30/095
PERFORMING OPERATIONS; TRANSPORTING
B60W50/14
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A vehicle collision avoidance and driver notification system includes an object detection unit configured to detect environmental obstacles and a collision detection unit for assessing the risk of collision. Depending on the risk assessment, a collision avoidance unit provides feedback to the driver or directly interacts with the vehicle engine.
Claims
1. A collision avoidance device for a lightweight vehicle operated by a driver, comprising: an object detection unit configured to detect obstacles following a trajectory relative to the unit; a collision avoidance unit communicatively coupled to the object detection unit; and a warning notification unit communicatively coupled to the collision avoidance unit and configured to convey an audible or visual warning to the driver; wherein the object detection unit calculates the distance to an environmental obstacle and the collision avoidance unit determines an index estimating the likelihood of collision with that obstacle; and wherein when the index meets a first predefined threshold, the warning notification unit notifies the driver.
2. The collision avoidance device for a lightweight vehicle as in claim 1, wherein the collision avoidance unit further determines the index by determining if the obstacle is following a trajectory that could lead to collision with the lightweight vehicle.
3. The collision avoidance device of claim 2, wherein the warning notification conveys a warning to the driver comprising an audible indication.
4. The collision avoidance device of claim 3, wherein the object detection unit includes a combination of a stereo imaging pickup device and a distance-measuring device comprising a radar sensor.
5. The collision avoidance device of claim 2, wherein the object detection unit is configured to operate with lightweight algorithms that require 8 GB of RAM or less and wherein the radar hardware requires no more than a 5-volt, 3-ampere power supply.
6. The collision avoidance device of claim 4, wherein the collision avoidance unit consists of low-computational-cost elements.
7. The collision avoidance device of claim 6, wherein the collision avoidance unit is responsible for the synchronization of the object detection unit data and uses a Kalman filter algorithm to track and identify the detected objects and wherein when the collision avoidance unit determines that the detected obstacle is following a dangerous trajectory with respect to the lightweight vehicle and the distance to the obstacle meets a predefined threshold, the warning notification unit notifies the driver.
8. The collision avoidance device of claim 6, wherein the collision avoidance unit incorporates at least one user specific metric in the trajectory estimate.
9. The collision avoidance device of claim 2, wherein an engine speed unit downregulates the engine speed as a function of the likelihood of collision index.
Description
SUMMARY OF FIGURES
DETAILED DESCRIPTION
[0021] The invention is implemented in the context of a lightweight vehicle operated by a driver. Environment detection devices, including a stereo camera and a radar, transmit information about the environment to a device that processes this data and assesses risk to the driver. An object detection unit processes the camera and radar data, assesses risks in the environment, and provides feedback to the driver.
[0022] Visual data is collected using a camera. Collected data is processed using a lightweight deep learning object detection system. Examples of currently available solutions that could be incorporated into such a system include Tiny-YOLO (You Only Look Once), MobileNet SSDLite (Single Shot MultiBox Detector), and TensorFlow Lite. First, the model is trained with a lightweight structure. In an embodiment, the TensorFlow open-source machine learning library is used. The model is then converted to a format optimized for inference, such as the OpenVINO Intermediate Representation format. In an embodiment, inference is performed by a vision processing unit (VPU) integrated on the vision hardware at 25-30 frames per second (fps). A currently available example of such a VPU is the Intel Myriad X.
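The post-processing stage of such a lightweight detector can be illustrated with a minimal sketch. The class labels, confidence scores, bounding boxes, and the 0.5 threshold below are hypothetical example values, not parameters from the specification; a deployed system would consume the real outputs of the converted model.

```python
# Illustrative post-processing of lightweight detector output.
# Labels, scores, boxes, and the 0.5 threshold are assumptions
# for demonstration only.

def filter_detections(detections, min_score=0.5):
    """Keep only detections whose confidence meets the threshold."""
    return [d for d in detections if d["score"] >= min_score]

raw = [
    {"label": "pedestrian", "score": 0.91, "box": (120, 40, 180, 200)},
    {"label": "car",        "score": 0.34, "box": (300, 60, 420, 180)},
    {"label": "bicycle",    "score": 0.78, "box": (10, 80, 60, 190)},
]

kept = filter_detections(raw)
print([d["label"] for d in kept])  # -> ['pedestrian', 'bicycle']
```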
[0023] In an embodiment, vision hardware with a 12-megapixel (MP) integrated 4K color camera is used. Alternatively, a 1 MP synchronized stereo camera with an integrated global shutter is used. Such a camera provides stereo depth without burdening the host with computational requirements.
[0024] Object detection data points are also collected from a low-power radar sensor. The detection algorithm clusters points to reduce noise and improve the accuracy of the measured obstacle distance from the sensor. In an embodiment, the latency between call and response is close to 200 milliseconds. For example, a Texas Instruments AWR1843 radar sensor is used to provide short- and medium-range radar with approximately a 150 meter maximum range.
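The clustering step described in paragraph [0024] can be sketched as follows. The greedy single-linkage strategy and the 0.5 m merge radius are illustrative assumptions, not details from the specification; averaging each cluster into one centroid smooths radar noise into a single distance estimate per obstacle.

```python
import math

def cluster_points(points, eps=0.5):
    """Greedy single-linkage clustering (illustrative): a point joins the
    first cluster that has any member within `eps` metres of it, otherwise
    it starts a new cluster. Returns one averaged centroid per cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) <= eps for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    centroids = []
    for c in clusters:
        cx = sum(x for x, _ in c) / len(c)
        cy = sum(y for _, y in c) / len(c)
        centroids.append((cx, cy))
    return centroids

# Two noisy returns per obstacle collapse to two centroids.
points = [(10.0, 0.0), (10.2, 0.1), (25.0, 3.0), (25.1, 3.1)]
centroids = cluster_points(points)
```

Being greedy, this sketch does not re-merge clusters that only later become connected; a production system would use a full clustering algorithm such as DBSCAN.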
[0025] The collision avoidance unit is responsible for synchronizing data acquisition from both sensors and applies algorithms for merging and synchronizing the vision and radar data. The unit also tracks and identifies detected objects and decides when to issue warnings. In an embodiment, a Kalman filter is used for tracking and identifying moving objects.
[0026] In an embodiment, detections are merged by comparing the tracked object's position as reported by the camera and by the radar. In a particular frame, a radar-detected object and a vision-detected object are associated. This association allows a more accurate distance calculation for the detected and classified object, incorporating the more accurate radar distance.
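The merging in paragraph [0026] can be sketched as a simple association step. The greedy nearest-neighbour matching and the 2.0 m association gate are illustrative simplifications; when a vision detection is paired with a radar return, the radar range replaces the coarser stereo-depth estimate.

```python
def merge_detections(vision_objs, radar_objs, max_gap=2.0):
    """Associate each vision detection with the nearest unused radar return
    (greedy nearest-neighbour, an illustrative simplification). Paired
    detections take the radar distance; unpaired ones stay vision-only."""
    merged, used = [], set()
    for v in vision_objs:
        best, best_gap = None, max_gap
        for i, r in enumerate(radar_objs):
            gap = abs(v["distance"] - r["distance"])
            if i not in used and gap <= best_gap:
                best, best_gap = i, gap
        if best is not None:
            used.add(best)
            merged.append({"label": v["label"],
                           "distance": radar_objs[best]["distance"]})
        else:
            merged.append(dict(v))  # vision-only detection
    return merged

vision = [{"label": "car", "distance": 21.0},
          {"label": "pedestrian", "distance": 8.0}]
radar = [{"distance": 20.4}]
merged = merge_detections(vision, radar)
```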
[0027] To define a warning situation, the collision avoidance unit determines whether a detected object is following a dangerous trajectory with respect to the unit. If the distance to the obstacle is lower than a predefined threshold, the warning notification unit notifies the driver of the dangerous situation. In an embodiment, the collision avoidance unit contains a lightweight computer with a 1.5 GHz 64-bit quad-core processor and 8 GB of RAM. Power requirements are approximately 5 V at 3 A. Warnings are conveyed on an integrated display in the vehicle, by a distinctive sound to attract the driver's attention, or by a combination of both. For example, a high-pitched sound could be used to attract the driver's attention. Dual-mode danger notification, visual and audible, is therefore provided for greater driver safety.
[0028] Object detection software is tuned to detect objects that drivers of lightweight vehicles are likely to encounter, such as pedestrians, pets, cars, trucks, buses, bicycles, motorbikes, traffic lights, and street signs, among other objects commonly encountered in urban traffic.
[0029] The visual and audible warning system is equipped with lightweight hardware and software capable of executing correctly while deployed on a lightweight vehicle in an urban traffic environment.
[0033] The collision avoidance unit determines whether to activate speaker 314 and indicator 316 in several ways. In an embodiment, the determination depends on a threat index that is at least a function of the object's distance. For example, the threat index can include distance thresholds such that objects within a first, nearer distance trigger a loud audible warning while objects at a second, farther distance trigger a visual indication such as an illuminated arrow in the direction of the obstacle. The index can be further enhanced by incorporating the object's trajectory into the calculation. In this embodiment, warnings can be prioritized for nearby objects with a trajectory in the path of the lightweight vehicle. On the other hand, nearby stationary objects outside the path of the lightweight vehicle, such as trees and traffic lights, generate lower priority warnings or are ignored altogether.
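The two-tier threat index of paragraph [0033] can be sketched as follows. The 5 m and 15 m thresholds and the boolean trajectory flag are hypothetical values for illustration, not figures from the specification.

```python
def threat_index(distance_m, closing, near_m=5.0, far_m=15.0):
    """Illustrative threat index combining distance and trajectory.
    Thresholds are assumed values. Returns 2 (loud audible warning),
    1 (visual indication, e.g. illuminated arrow), or 0 (ignore)."""
    if not closing:            # stationary object outside the vehicle's path
        return 0
    if distance_m <= near_m:   # first, nearer threshold
        return 2
    if distance_m <= far_m:    # second, farther threshold
        return 1
    return 0
```

A tree 4 m away but off the vehicle's path (`closing=False`) thus scores 0, while a pedestrian closing at 4 m scores 2.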
[0035] At step 412, a first decision is made whether the index generated at step 410 exceeds a first predefined threshold. If so, the driver is notified with a first indication at step 414. In an embodiment, the first indication is made visually by way of an LED indicator.
[0036] At step 416, a second decision is made whether the index generated at step 410 exceeds a second predefined threshold. If so, the driver is notified with a second indication at step 418. In an embodiment, the second indication is made audibly by way of a speaker that gives one or more directions to the driver to help avoid the obstacle.
[0037] At step 420, a third decision is made whether the index generated at step 410 exceeds a third predefined threshold. If so, a third indication is communicated to the vehicle engine actuation unit. In an embodiment, the third indication instructs the vehicle engine actuation unit to adjust the engine speed of the lightweight vehicle downward, thereby reducing the lightweight vehicle's velocity.
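The escalating decisions of steps 412, 416, and 420 can be sketched as a threshold cascade. The numeric thresholds below are hypothetical; the responses accumulate as the index rises, so a high index triggers the LED, the spoken guidance, and the engine-speed reduction together.

```python
def respond(index, t1=0.3, t2=0.6, t3=0.9):
    """Sketch of the decision cascade in steps 412-420. Threshold
    values are assumed for illustration."""
    actions = []
    if index > t1:
        actions.append("led_indicator")        # step 414: visual indication
    if index > t2:
        actions.append("speaker_direction")    # step 418: audible guidance
    if index > t3:
        actions.append("reduce_engine_speed")  # engine actuation unit
    return actions

responses = respond(0.95)
```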
[0042] For some objects, such as unclassified objects 810 and 812, the risk of collision may be determined to be relatively low by a combination of analytics. For example, objects outside the current path of the lightweight vehicle but relatively close may be excluded as risks if they are classified as immovable, such as traffic signs or trees. On the other hand, such close objects may be relatively large risks if they are classified as cars or motorbikes in motion. Trajectory calculations allow non-moving parked vehicles to be distinguished from moving vehicles and thereby further refine the accuracy of the threat index.
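The risk refinement of paragraph [0042] can be sketched as a small classification rule. The label set and the in-path/moving flags are illustrative assumptions standing in for the full analytics described above.

```python
# Example immovable classes; the real class list is a detector detail.
IMMOVABLE = {"traffic sign", "tree", "traffic light"}

def collision_risk(label, in_path, moving):
    """Illustrative risk refinement: off-path immovable objects are
    excluded as risks, while off-path vehicles in motion remain risks;
    trajectory data distinguishes parked from moving vehicles."""
    if in_path:
        return "high"
    if label in IMMOVABLE and not moving:
        return "low"
    if moving:
        return "high"   # e.g. a car or motorbike closing from the side
    return "low"        # e.g. a parked vehicle
```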