DRIVER ASSISTANCE SYSTEM HAVING REAR-VIEW CAMERA AND CROSS-TRAFFIC SENSOR SYSTEM WITH SIMULTANEOUS VIEW
20200148110 · 2020-05-14
Assignee
Inventors
- James Hockridge Critchley (Lake Orion, MI, US)
- Peter Lamprecht (Shelby Township, MI, US)
- Tony Moussa (Rochester Hills, MI, US)
CPC classification
B60K2360/179
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/8066
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/301
PERFORMING OPERATIONS; TRANSPORTING
G01S17/87
PHYSICS
G01S17/86
PHYSICS
G01S15/86
PHYSICS
B60R2300/305
PERFORMING OPERATIONS; TRANSPORTING
G01S13/87
PHYSICS
B60K35/28
PERFORMING OPERATIONS; TRANSPORTING
B60R1/00
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/602
PERFORMING OPERATIONS; TRANSPORTING
B60K35/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60R1/00
PERFORMING OPERATIONS; TRANSPORTING
H04N7/18
ELECTRICITY
G06T19/00
PHYSICS
G01S13/86
PHYSICS
Abstract
A driver assistance system for a vehicle includes a rear-view camera constructed and arranged to obtain an image of an area behind a rear of the vehicle. A non-camera sensor system is constructed and arranged to obtain data regarding dynamic or static environmental features at opposing sides of the vehicle. An electronic control unit is electrically connected with the rear-view camera and with the sensor system. The electronic control unit is constructed and arranged to create an image from the data received from the sensor system. A display system is controlled by the electronic control unit and is constructed and arranged to display simultaneously to a driver of the vehicle at a single location, the image obtained by the rear-view camera and the image created by the electronic control system.
Claims
1. A driver assistance system for a vehicle comprising: a rear view camera constructed and arranged to obtain an image of an area behind a rear of the vehicle, a non-camera sensor system constructed and arranged to obtain data regarding dynamic or static environmental features at opposing sides of the vehicle, an electronic control unit electrically connected with the rear view camera and with the sensor system, the electronic control unit being constructed and arranged to create an image from the data received from the sensor system, and a display system controlled by the electronic control unit and constructed and arranged to display simultaneously to a driver of the vehicle at a single location, the image obtained by the rear-view camera and the image created by the electronic control system.
2. The system of claim 1, wherein the electronic control unit includes a processor circuit constructed and arranged to create the image from the data received from the sensor system.
3. The system of claim 2, wherein the processor circuit is constructed and arranged to create a virtual 3D image from the data received from the sensor system.
4. The system of claim 1, wherein the sensor system comprises at least first and second sensors disposed on opposing sides of the vehicle.
5. The system of claim 4, wherein the sensors are radar sensors.
6. The system of claim 4, wherein the sensors are lidar sensors.
7. The system of claim 4, wherein the sensors are ultrasonic sensors.
8. The system of claim 1, wherein the display system includes a display screen on a console of the vehicle.
9. The system of claim 1, wherein the display system includes a display screen defined as part of a rear-view mirror of the vehicle.
10. The system of claim 1, wherein the display system is constructed and arranged to project the images onto a surface of the vehicle.
11. The system of claim 8, wherein the display screen is a touch activated screen constructed and arranged to enable a user of the vehicle to dynamically change a viewpoint, perspective and/or field of view of the image created from the data received from the sensor system.
12. A method of monitoring environmental features at opposing sides and at a rear of a vehicle, the method comprising: obtaining, with a camera, a first image of an area behind a rear of the vehicle, obtaining, without a camera, data regarding dynamic or static environmental features at opposing sides of the vehicle, creating a second image from the obtained data, and displaying simultaneously to a driver of the vehicle, the first and second images at a single location.
13. The method of claim 12, wherein the step of obtaining data includes using first and second sensors disposed on opposing sides of the vehicle to obtain the data.
14. The method of claim 13, wherein the sensors are radar sensors.
15. The method of claim 13, wherein the sensors are lidar sensors.
16. The method of claim 13, wherein the sensors are ultrasonic sensors.
17. The method of claim 14, wherein the step of creating the second image is performed by a processor circuit, with the processor circuit interpreting an intensity of a return radar signal as a height of an object in the second image.
18. The method of claim 12, wherein the step of displaying the first and second images includes displaying the images on a display screen in the vehicle.
19. The method of claim 12, wherein the step of displaying the first and second images includes projecting the images onto a surface of the vehicle.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] The invention will be better understood from the following detailed description of the preferred embodiments thereof, taken in conjunction with the accompanying drawings, wherein like reference numerals refer to like parts, in which:
DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
[0015] With reference to
[0016] The system 10 includes a cross-traffic sensor system including at least sensors 16 and 18 mounted to opposing sides of the vehicle 10 and constructed and arranged to obtain data regarding dynamic or static environmental features at the sides of the vehicle 10. The sensors 16 and 18 are shown mounted to a rear of the vehicle but can be mounted anywhere on opposing sides of the vehicle 10. More than two sensors can be provided. The sensors 16, 18 are preferably conventional radar, lidar, ultrasonic, or other similar non-camera sensors. A virtual image of the data obtained by the sensors 16, 18 can be created and viewed by engineers as part of the sensor/function development process, but this virtual image is not presented to the end user in conventional driver assistance systems. The inventors realized, however, that this image data can be useful in the driver assistance system 10. An example of this data as a virtual 3D image is generally indicated at 20 in
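The virtual image described in paragraph [0016], together with the radar embodiment of claim 17 (interpreting return intensity as object height), could be sketched as follows. This is an illustrative example, not text from the patent; the grid dimensions, function names, and the linear intensity-to-height mapping are all assumptions.

```python
# Illustrative sketch (not from the patent): build a top-down height grid
# from non-camera sensor detections, mapping radar return intensity to
# object height as in claim 17. Grid size, cell size, and the linear
# intensity-to-height scale are assumed values.

GRID_W, GRID_H = 40, 40      # cells in the virtual scene
CELL_M = 0.25                # metres covered by each cell
MAX_HEIGHT_M = 2.0           # tallest obstacle rendered in the virtual image

def intensity_to_height(intensity, max_intensity=255):
    """Interpret a stronger radar return as a taller object (assumed linear map)."""
    return MAX_HEIGHT_M * min(intensity, max_intensity) / max_intensity

def build_height_grid(detections):
    """detections: iterable of (x_m, y_m, intensity) in vehicle coordinates,
    with x measured across the vehicle and y rearward from the bumper."""
    grid = [[0.0] * GRID_W for _ in range(GRID_H)]
    for x_m, y_m, intensity in detections:
        col = int(x_m / CELL_M) + GRID_W // 2   # vehicle centred in x
        row = int(y_m / CELL_M)                 # distance behind the bumper
        if 0 <= row < GRID_H and 0 <= col < GRID_W:
            h = intensity_to_height(intensity)
            grid[row][col] = max(grid[row][col], h)  # keep tallest return per cell
    return grid
```

A renderer could then draw each non-zero cell as a block of the given height to produce a virtual 3D view such as image 20.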
[0017] As shown in
[0018] The system 10 further includes a display system 28 that is controlled by the ECU 22. With reference to
[0019] Displaying the additional sensor system data (via image 20) adjacent to or integrated with the rear-view camera image 15 at a single location greatly increases the driver's situational awareness while the driver is focused on the back-up image, since the driver can also view a representation of cross-traffic obstacles as detected by the sensor system.
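The side-by-side presentation of paragraph [0019] can be sketched as a simple frame composition step. This is an illustrative example, not part of the patent; the frame representation and function name are assumptions, and an integrated (overlaid) layout would work analogously.

```python
# Illustrative sketch (not from the patent): compose the rear-view camera
# frame and the sensor-derived virtual image side by side into one frame
# for display at a single location. Frame layout is an assumption.

def compose_side_by_side(camera_frame, sensor_frame):
    """Each frame is a list of rows, each row a list of pixel values.
    Both frames must share a row count to be displayed together."""
    if len(camera_frame) != len(sensor_frame):
        raise ValueError("frames must have the same number of rows")
    # Concatenate each camera row with the matching sensor-image row.
    return [cam + sen for cam, sen in zip(camera_frame, sensor_frame)]
```

The composed frame would then be sent to the display screen 30 (or a heads-up/projected display) as a single image.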
[0020] The display screen 30 can be a touch activated screen to enable the driver to change the viewpoint, perspective and/or field of view of the image 20 dynamically to focus on an identified high priority element of the environment, such as a dynamic approaching target or a static wall.
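The touch-driven viewpoint change of paragraph [0020] implies some adjustable view state behind the virtual image. The sketch below is illustrative only and not from the patent; the field names, gesture mapping, and step sizes are all assumptions.

```python
# Illustrative sketch (not from the patent): minimal view state that a
# touch-activated screen might adjust so the driver can re-aim the virtual
# image at a high-priority element (e.g. an approaching target or a wall).

from dataclasses import dataclass

@dataclass
class ViewState:
    yaw_deg: float = 0.0      # rotation of the virtual viewpoint
    pitch_deg: float = 30.0   # downward tilt onto the scene
    fov_deg: float = 90.0     # field of view of the rendered image

def apply_drag(view, dx_px, dy_px, deg_per_px=0.2):
    """A horizontal drag pans the viewpoint; a vertical drag tilts it."""
    view.yaw_deg = (view.yaw_deg + dx_px * deg_per_px) % 360.0
    view.pitch_deg = max(0.0, min(89.0, view.pitch_deg - dy_px * deg_per_px))
    return view

def apply_pinch(view, scale):
    """Pinch-to-zoom narrows or widens the field of view within fixed limits."""
    view.fov_deg = max(20.0, min(120.0, view.fov_deg / scale))
    return view
```

The renderer that produces image 20 would re-project the sensor data each frame from the current `ViewState`.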
[0021] Instead of providing the display screen 30 in the console 29, the display screen can be incorporated in the rear view mirror of the vehicle, or the images 15 and 20 can be shown via a heads-up display system or can be projected onto any surface of the vehicle 10.
[0022] Due to the simultaneous display of images 15 and 20 at a common location, the driver can simultaneously monitor traffic objects which are both directly behind and crossing behind the vehicle 10 without the use of an expensive 360-degree camera system (e.g., surround view). In contrast to a surround view camera system, the inclusion of the sensor data from non-camera sensors 16, 18 in the view presented to the driver allows increased range and robustness to lighting and weather conditions.
[0023] The operations and algorithms described herein can be implemented as executable code within the micro-controller or ECU 22 having the processor circuit 24 as described, or stored on a standalone computer or machine readable non-transitory tangible storage medium that are completed based on execution of the code by a processor circuit implemented using one or more integrated circuits. Example implementations of the disclosed circuits include hardware logic that is implemented in a logic array such as a programmable logic array (PLA), a field programmable gate array (FPGA), or by mask programming of integrated circuits such as an application-specific integrated circuit (ASIC). Any of these circuits also can be implemented using a software-based executable resource that is executed by a corresponding internal processor circuit such as a micro-processor circuit (not shown) and implemented using one or more integrated circuits, where execution of executable code stored in an internal memory circuit causes the integrated circuit(s) implementing the processor circuit to store application state variables in processor memory, creating an executable application resource (e.g., an application instance) that performs the operations of the circuit as described herein. Hence, use of the term circuit in this specification refers to both a hardware-based circuit implemented using one or more integrated circuits and that includes logic for performing the described operations, or a software-based circuit that includes a processor circuit (implemented using one or more integrated circuits), the processor circuit including a reserved portion of processor memory for storage of application state data and application variables that are modified by execution of the executable code by a processor circuit. 
The memory circuit 26 can be implemented, for example, using a non-volatile memory such as a programmable read only memory (PROM) or an EPROM, and/or a volatile memory such as a DRAM, etc.
[0024] The foregoing preferred embodiments have been shown and described for the purposes of illustrating the structural and functional principles of the present invention, as well as illustrating the methods of employing the preferred embodiments and are subject to change without departing from such principles. Therefore, this invention includes all modifications encompassed within the spirit of the following claims.