Device and Method for Determining Objects Around a Vehicle
20230037900 · 2023-02-09
Inventors
- Mirko Meuter (Erkrath, DE)
- Christian Nunn (Hückeswagen, DE)
- Jan Siegemund (Kö, DE)
- Jittu Kurian (Velbert, DE)
- Alessandro Cennamo (Wuppertal, DE)
- Marco Braun (Düsseldorf, DE)
- Dominic Spata (Witten, DE)
CPC Classification
International Classification
Abstract
The present disclosure is directed at systems and methods for determining objects around a vehicle. In aspects, a system includes a sensor unit having at least one radar sensor arranged and configured to obtain radar image data of external surroundings to determine objects around a vehicle. The system further includes a processing unit adapted to process the radar image data to generate a top view image of the external surroundings of the vehicle. The top view image is configured to be displayed on a display unit and useful to indicate a relative position of the vehicle with respect to determined objects.
Claims
1. A system comprising: a sensor unit, the sensor unit comprising at least one radar sensor arranged and configured to obtain radar image data of external surroundings of a vehicle to determine objects around the vehicle; and a processing unit, the processing unit configured to process the radar image data to generate a top view image of the external surroundings of the vehicle, the top view image configured to be displayed on a display unit and useful to indicate a relative position of the vehicle with respect to determined objects.
2. The system according to claim 1, wherein the sensor unit further comprises one or more additional sensors further arranged and configured to obtain additional sensor data of the external surroundings of the vehicle, and wherein the processing unit is further configured to process the additional sensor data to visually enhance the top view image to be displayed on the display unit.
3. The system according to claim 2, wherein the processing unit is further configured to process at least one of the radar image data or the additional sensor data using at least one of a machine-learning algorithm or an image enhancement algorithm to visually enhance the top view image to be displayed on the display unit.
4. The system according to claim 1, wherein the at least one radar sensor is further arranged and configured to obtain Doppler data of the external surroundings of the vehicle, and wherein the processing unit is further configured to process the Doppler data to visually enhance the top view image to be displayed on the display unit.
5. The system according to claim 1, wherein the processing unit is further configured to process radar image data obtained from multiple scans to generate the top view image to be displayed on the display unit.
6. The system according to claim 1, wherein the processing unit is further configured to process the radar image data to determine and highlight on the top view image at least one of an unoccupied space or one or more objects.
7. The system according to claim 6, wherein the processing unit is further configured to process the radar image data to determine dimensions of the unoccupied space.
8. The system according to claim 7, wherein the processing unit is further configured to, based on the dimensions of the unoccupied space, determine if the unoccupied space is sufficiently large to accommodate the vehicle.
9. The system according to claim 8, further comprising an autonomous driving unit communicatively coupled to the processing unit.
10. The system according to claim 9, wherein the autonomous driving unit is configured to control a movement of the vehicle based on input received from the processing unit.
11. The system according to claim 10, wherein the autonomous driving unit is further configured to, based on input received from the processing unit and a determination that the unoccupied space is sufficiently large to accommodate the vehicle, position the vehicle in the unoccupied space.
12. A method comprising: obtaining radar image data of external surroundings of a vehicle to determine objects around the vehicle; and processing the radar image data to generate a top view image of the external surroundings of the vehicle, the top view image configured to be displayed on a display unit and useful to indicate a relative position of the vehicle with respect to determined objects.
13. The method according to claim 12, further comprising: obtaining additional sensor data of the external surroundings of the vehicle; and processing the additional sensor data to visually enhance the top view image to be displayed on the display unit.
14. The method according to claim 13, further comprising: processing at least one of the radar image data or the additional sensor data using at least one of a machine-learning algorithm or an image enhancement algorithm to visually enhance the top view image to be displayed on the display unit.
15. The method according to claim 12, further comprising: obtaining Doppler data of the external surroundings of the vehicle; and processing the Doppler data to visually enhance the top view image to be displayed on the display unit.
16. The method according to claim 12, further comprising: processing radar image data obtained from multiple scans to generate the top view image to be displayed on the display unit.
17. The method according to claim 12, further comprising: processing the radar image data to determine and highlight on the top view image at least one of an unoccupied space or one or more objects.
18. A non-transitory computer-readable storage medium storing one or more programs comprising instructions, which when executed by a processor, cause the processor to perform operations including: obtaining radar image data of external surroundings of a vehicle to determine objects around the vehicle; and processing the radar image data to generate a top view image of the external surroundings of the vehicle, the top view image configured to be displayed on a display unit and useful to indicate a relative position of the vehicle with respect to determined objects.
19. The non-transitory computer-readable storage medium according to claim 18, wherein the instructions, when executed, configure the processor to perform additional operations including: obtaining additional sensor data of the external surroundings of the vehicle; and processing the additional sensor data to visually enhance the top view image to be displayed on the display unit.
20. The non-transitory computer-readable storage medium according to claim 19, wherein the instructions, when executed, configure the processor to perform additional operations including: processing at least one of the radar image data or the additional sensor data using at least one of a machine-learning algorithm or an image enhancement algorithm to visually enhance the top view image to be displayed on the display unit.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0070] Example embodiments and functions of the present disclosure are described herein in conjunction with the accompanying drawings, which show the disclosed devices and methods schematically.
[0074] In the figures, like numerals refer to same or similar features.
DETAILED DESCRIPTION
[0076] The device 10 comprises a radar sensor 12 arranged and adapted to obtain radar image data of the external surroundings 100 of the vehicle 1 to determine objects around the vehicle. The radar sensor may be part of a sensor unit (not shown). The device 10 further comprises a processing unit 14 adapted to process the radar image data to generate a top view image of the external surroundings 100 of the vehicle 1, i.e., an image, visible to the human eye, that shows the vehicle from above and indicates the relative position of the vehicle with respect to the determined objects. The top view image is displayed on a display unit (not shown) by the processing unit 14.
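One way such a top view image can be generated is to rasterize polar radar detections into a Cartesian occupancy grid centered on the vehicle. The following is a minimal sketch of that idea, not the patented implementation; the function name and the grid parameters (`GRID_SIZE`, `CELL_M`) are illustrative assumptions.

```python
import numpy as np

GRID_SIZE = 200          # cells per side (200 x 200)
CELL_M = 0.25            # metres per cell -> 50 m x 50 m field of view

def radar_to_top_view(ranges_m, azimuths_rad):
    """Project polar radar detections into a Cartesian top-view grid.

    The vehicle sits at the grid centre; each detection marks its
    cell as occupied (1.0). Hypothetical helper, for illustration only.
    """
    grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.float32)
    # Polar -> Cartesian in the vehicle frame (x forward, y left).
    x = np.asarray(ranges_m) * np.cos(azimuths_rad)
    y = np.asarray(ranges_m) * np.sin(azimuths_rad)
    # Vehicle frame -> grid indices (row 0 = far ahead of the vehicle).
    rows = (GRID_SIZE // 2 - x / CELL_M).astype(int)
    cols = (GRID_SIZE // 2 - y / CELL_M).astype(int)
    # Keep only detections that fall inside the grid.
    ok = (rows >= 0) & (rows < GRID_SIZE) & (cols >= 0) & (cols < GRID_SIZE)
    grid[rows[ok], cols[ok]] = 1.0
    return grid

# A single detection 10 m straight ahead lands above the grid centre.
top_view = radar_to_top_view([10.0], [0.0])
```

The resulting grid can be rendered directly as the top view image, with the vehicle drawn at the centre cell.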
[0077] The device 10 further comprises a sensor 16 arranged and adapted to obtain other sensor data of the external surroundings 100 of the vehicle 1, wherein the processing unit 14 is further adapted to process the other sensor data to visually enhance the image. The sensor 16 may be part of the sensor unit (not shown).
[0078] The radar sensor 12 is arranged and adapted to obtain Doppler data of the external surroundings 100 of the vehicle 1 and the processing unit 14 is further adapted to process the Doppler data to visually enhance the image.
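One plausible use of the Doppler data is to tint detections in the top view by their radial velocity so that moving objects stand out from static ones. This sketch assumes a simple colour scheme (approaching red, receding blue, static grey) that is not prescribed by the disclosure.

```python
import numpy as np

def doppler_overlay(grid, rows, cols, doppler_mps):
    """Return an RGB top view where each detection cell is tinted by its
    Doppler (radial) velocity: red approaching, blue receding, grey static.
    Illustrative sketch; thresholds and colours are assumptions."""
    # Start from the occupancy grid rendered in dim grey.
    rgb = np.stack([grid * 0.5] * 3, axis=-1).astype(np.float32)
    for r, c, v in zip(rows, cols, doppler_mps):
        if v < -0.1:          # moving toward the vehicle
            rgb[r, c] = (1.0, 0.0, 0.0)
        elif v > 0.1:         # moving away from the vehicle
            rgb[r, c] = (0.0, 0.0, 1.0)
        else:                 # effectively static
            rgb[r, c] = (0.7, 0.7, 0.7)
    return rgb

# Two detections: one approaching at 3 m/s, one static.
grid = np.zeros((4, 4), dtype=np.float32)
grid[1, 1] = grid[2, 2] = 1.0
img = doppler_overlay(grid, [1, 2], [1, 2], [-3.0, 0.0])
```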
[0079] The processing unit 14 is further adapted to process other data to visually enhance the image.
[0080] The processing unit 14 is further adapted to process radar image data from multiple scans to generate the image.
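Combining multiple scans typically requires compensating the vehicle's own motion between scans before rasterizing all detections into one grid. The sketch below assumes ego displacement per scan is available (e.g., from odometry); the function and its parameters are hypothetical.

```python
import numpy as np

def accumulate_scans(scans, ego_offsets, grid_size=100, cell_m=0.5):
    """Accumulate several radar scans into one denser top-view grid.

    scans: list of (N_i, 2) arrays of (x, y) detections, each in the
           vehicle frame at the time of its own scan.
    ego_offsets: (dx, dy) vehicle displacement since each scan.
    Illustrative sketch, not the patented implementation."""
    grid = np.zeros((grid_size, grid_size), dtype=np.float32)
    for pts, (dx, dy) in zip(scans, ego_offsets):
        # Ego-motion compensation: a point seen earlier, before the
        # vehicle moved dx forward, is now dx closer.
        x = pts[:, 0] - dx
        y = pts[:, 1] - dy
        rows = (grid_size // 2 - x / cell_m).astype(int)
        cols = (grid_size // 2 - y / cell_m).astype(int)
        ok = (rows >= 0) & (rows < grid_size) & (cols >= 0) & (cols < grid_size)
        grid[rows[ok], cols[ok]] += 1.0   # cells seen in more scans score higher
    return grid

# The same obstacle seen in two scans, with 1 m of forward ego motion between:
scan_old = np.array([[10.0, 0.0]])   # 10 m ahead at the older scan
scan_new = np.array([[9.0, 0.0]])    # 9 m ahead now
acc = accumulate_scans([scan_old, scan_new], [(1.0, 0.0), (0.0, 0.0)])
```

Because both scans vote for the same cell after compensation, persistent obstacles accumulate higher scores than noise.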
[0081] The processing unit 14 is further adapted to use machine learning and an image enhancement algorithm to visually enhance the image.
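As a minimal, non-learned stand-in for the image enhancement step, a 3x3 box blur can thicken sparse radar returns so they read better on a display. A deployed system would more likely use a trained model; this only illustrates where such a step plugs into the pipeline.

```python
import numpy as np

def enhance_top_view(grid):
    """Return a smoothed copy of the occupancy grid (3x3 box filter).

    Illustrative placeholder for the image enhancement algorithm."""
    padded = np.pad(grid, 1, mode="constant")
    out = np.zeros_like(grid, dtype=np.float32)
    # Sum the 3x3 neighbourhood of every cell via shifted views.
    for dr in (0, 1, 2):
        for dc in (0, 1, 2):
            out += padded[dr:dr + grid.shape[0], dc:dc + grid.shape[1]]
    return out / 9.0

# A single strong return spreads into a visible 3x3 blob.
grid = np.zeros((5, 5), dtype=np.float32)
grid[2, 2] = 9.0
smooth = enhance_top_view(grid)
```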
[0082] The processing unit 14 is further adapted to process the radar image data to determine and highlight an unoccupied space 200 in the external surroundings 100 of the vehicle 1 in the image.
[0083] The processing unit 14 is further adapted to process the radar image data to determine if the unoccupied space 200 is sufficiently large to accommodate the vehicle 1.
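One way to make this check concrete is to compare a rectangle of free grid cells against the vehicle footprint plus a safety margin. The footprint and margin values below are assumed for illustration and do not come from the disclosure.

```python
import numpy as np

# Assumed example dimensions (not from the disclosure):
VEHICLE_LEN_M = 4.5
VEHICLE_WID_M = 1.8
MARGIN_M = 0.5

def space_fits_vehicle(grid, r0, r1, c0, c1, cell_m=0.25):
    """True if grid[r0:r1, c0:c1] is fully unoccupied and at least as
    large as the vehicle footprint with a margin on every side.
    Illustrative sketch of the sufficiency check."""
    region = grid[r0:r1, c0:c1]
    if region.size == 0 or region.any():      # occupied cell inside -> no fit
        return False
    length_m = (r1 - r0) * cell_m
    width_m = (c1 - c0) * cell_m
    return (length_m >= VEHICLE_LEN_M + 2 * MARGIN_M and
            width_m >= VEHICLE_WID_M + 2 * MARGIN_M)

grid = np.zeros((100, 100), dtype=np.float32)      # entirely unoccupied
fits = space_fits_vehicle(grid, 10, 40, 10, 30)        # 7.5 m x 5.0 m free
too_small = space_fits_vehicle(grid, 10, 20, 10, 14)   # 2.5 m x 1.0 m free
```

A positive result could then be passed to the autonomous driving unit 18 as the trigger for positioning the vehicle in the space.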
[0084] The processing unit 14 is further adapted to process the radar image data to determine and highlight an object 300 in the external surroundings 100 of the vehicle 1 in the image.
[0085] The device 10 further comprises an autonomous driving unit 18 that is adapted to control a movement of the vehicle 1 based on input of the processing unit 14.
[0086] For simplicity, the device 10, the radar sensor 12, the other sensor 16, the processing unit 14, and the autonomous driving unit 18 are shown only schematically.
[0088] In a first step 1100, radar image data of the external surroundings of the vehicle are obtained to determine objects around the vehicle.
[0089] In a next step 1200, the radar image data are processed to generate a top view image of the external surroundings of the vehicle, indicating the relative position of the vehicle with respect to the determined objects.
[0090] In a further step 1300, other sensor data of the external surroundings of the vehicle are obtained.
[0091] In a next step 1400, the other sensor data are processed to visually enhance the top view image.
[0092] In another step 1500, Doppler data of the external surroundings of the vehicle are obtained.
[0093] In a next step 1600, the Doppler data are processed to visually enhance the top view image.
[0094] In a next step 1700, other data are processed to visually enhance the top view image.
[0095] In a further step 1800, radar image data obtained from multiple scans are processed to generate the top view image.
[0096] In another step 1900, a machine-learning algorithm is used to visually enhance the top view image.
[0097] In a further step 2000, an image enhancement algorithm is used to visually enhance the top view image.
[0098] In a further step 2100, the radar image data are processed to determine and highlight an unoccupied space in the external surroundings of the vehicle in the top view image.
[0099] In another step 2200, the radar image data are processed to determine and highlight an object in the external surroundings of the vehicle in the top view image.
[0100] In a last step 2300, a movement of the vehicle is controlled based on input of the processing unit.
[0101] In a further step (not shown) the generated top view image is displayed on a display unit.
[0102] The steps can be performed in a different order. The method 1000 can repeat continuously.
[0104] As can be seen from the top view image 5000, the vehicle 1 is centered in the picture. In the external surroundings 100 of the vehicle 1, an unoccupied space 200 and objects 300 are visually highlighted.
[0105] This top view image 5000 can be displayed to a driver of the vehicle on a display unit of a portable device and/or of the vehicle.
Conclusion
[0106] Although implementations for determining objects around a vehicle have been described in language specific to certain features and/or methods, the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations for determining objects around a vehicle.
[0107] Unless context dictates otherwise, use herein of the word “or” may be considered use of an “inclusive or,” or a term that permits inclusion or application of one or more items that are linked by the word “or” (e.g., a phrase “A or B” may be interpreted as permitting just “A,” as permitting just “B,” or as permitting both “A” and “B”). Also, as used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. For instance, “at least one of a, b, or c” can cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiples of the same element (e.g., a-a, a-a-a, a-a-b, a-a-c, a-b-b, a-c-c, b-b, b-b-b, b-b-c, c-c, and c-c-c, or any other ordering of a, b, and c). Further, items represented in the accompanying figures and terms discussed herein may be indicative of one or more items or terms, and thus reference may be made interchangeably to single or plural forms of the items and terms in this written description.
LIST OF REFERENCE CHARACTERS FOR THE ELEMENTS IN THE DRAWINGS
[0108] The following is a list of certain items in the drawings, in numerical order. Items not listed may nonetheless be part of a given embodiment. For better legibility of the text, a given reference character may be recited near some, but not all, recitations of the referenced item in the text. The same reference number may be used with reference to different examples or different instances of a given item.
[0109] 1 vehicle
[0110] 10 device
[0111] 12 radar sensor
[0112] 14 processing unit
[0113] 16 sensor
[0114] 18 autonomous driving unit
[0115] 100 external surroundings
[0116] 200 unoccupied space
[0117] 300 object
[0118] 1000 method
[0119] 1100 step of obtaining radar image data
[0120] 1200 step of generating the top view image
[0121] 1300 step of obtaining other sensor data
[0122] 1400 step of processing other sensor data
[0123] 1500 step of obtaining Doppler data
[0124] 1600 step of processing Doppler data
[0125] 1700 step of processing other data
[0126] 1800 step of processing multiple scans
[0127] 1900 step of machine-learning enhancement
[0128] 2000 step of image enhancement
[0129] 2100 step of determining an unoccupied space
[0130] 2200 step of determining an object
[0131] 2300 step of controlling vehicle movement
[0132] 5000 top view image