Process for Examining a Loss of Media of a Motor Vehicle as Well as Motor Vehicle and System for Implementing Such a Process
20170364756 · 2017-12-21
Inventors
CPC classification
G07C5/08
PHYSICS
B60R2300/802
PERFORMING OPERATIONS; TRANSPORTING
B60R2300/302
PERFORMING OPERATIONS; TRANSPORTING
G08G1/087
PHYSICS
G08G1/096758
PHYSICS
B60R1/00
PERFORMING OPERATIONS; TRANSPORTING
G06V20/56
PHYSICS
G08G1/205
PHYSICS
International classification
G07C5/08
PHYSICS
Abstract
A process is provided for examining a loss of media of a motor vehicle. By use of a first camera arranged on a motor vehicle, the driving route in the area in front of the motor vehicle to be monitored, which is driving in the travel direction, is scanned. The route behind the motor vehicle to be monitored is scanned by way of a second camera arranged on a motor vehicle. When the driving route is scanned, a forward image and a rearward image are acquired. The two images are compared in order to detect a medium lost by the motor vehicle to be monitored. The corresponding medium can be classified, and appropriate actions can be automatically carried out corresponding to the classification of the medium.
Claims
1. A process for examining a loss of media of a motor vehicle, the process comprising the acts of: scanning, via a first scanning device, a route in an area in front of the motor vehicle that is driving in a travel direction and is to be monitored; scanning, via a second scanning device, the route behind the motor vehicle to be monitored, wherein at least a forward image and a rearward image are acquired via the scanning of the route; and comparing the forward and the rearward images in order to detect the loss of media by the motor vehicle being monitored.
2. The process according to claim 1, further comprising the act of: for comparing the forward and the rearward images, aligning the forward and rearward images, via a characteristic analysis, to coincide with one another.
3. The process according to claim 2, wherein before the forward and rearward images are compared with one another, the forward and rearward images are mutually adapted with respect to one or more of: resolution, contrast, color representation, and perspective.
4. The process according to claim 1, wherein a differential image is produced for the comparison, wherein objects in the differential image and/or corresponding objects in the rearward image are analyzed.
5. The process according to claim 4, wherein when analyzing the objects, a shape and/or spectral analysis is carried out.
6. The process according to claim 4, wherein several rearward images, which each show the object to be analyzed, are analyzed jointly via triangulation, so that a three-dimensional description of a surface of the object is produced.
7. The process according to claim 1, wherein a time stamp and/or a location stamp is added to each individual image, which stamp indicates when the respective image was acquired and/or the location where the respective image was acquired.
8. The process according to claim 1, wherein the forward and rearward images are acquired by at least two scanning devices, which are arranged at the same motor vehicle.
9. The process according to claim 1, wherein the forward and rearward images are acquired by at least two scanning devices, which are arranged at different motor vehicles, the images being transmitted to a server, on which they are compared.
10. The process according to claim 1, wherein one or more of the following scanning devices for acquiring the forward and/or rearward image are used: an optical camera, an infrared camera, a laser scanner, or a radar sensor.
11. The process according to claim 1, wherein, upon detection of a predefined medium, one or more of the following actions is carried out: (i) a vehicle driver of the monitored vehicle is informed, (ii) a public authority is informed, (iii) other traffic participants are informed by car-to-car communication, light signals and/or honking signals.
12. The process according to claim 1, wherein, before the comparison of the forward and rearward images, in each case a rearward image selected from a quantity of several rearward images is assigned to a forward image, or a forward image selected from a quantity of several forward images is assigned to a rearward image.
13. A motor vehicle, comprising at least one front scanning device and one rear scanning device for scanning of a driving route, and a control device which is configured to implement the process according to claim 1.
14. A system for implementing the process according to claim 1, comprising: at least two scanning devices, which are arranged on a motor vehicle, wherein the motor vehicle has a data connection to a server situated outside the motor vehicle, which server compares the forward and rearward images.
15. The system according to claim 14, wherein the two scanning devices are arranged on two different motor vehicles.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
[0054] The central control device 6 is connected with a display device 10, by way of which messages can be transmitted to the vehicle driver.
[0055] By use of the cameras 2-5, the motor vehicle 1 can scan a driving route, can produce images of the route and can evaluate the images in the central control device 6 in order to examine a loss of media of a motor vehicle. By use of the optical cameras 2, 4, images of the driving route are taken in the visible wavelength range of light. By use of the infrared cameras 3, 5, infrared images of the driving route are taken, which images show the thermal condition of the surface of the driving route. When a motor vehicle loses oil, for example, the oil can be detected rapidly and reliably because its temperature is higher than the ambient temperature.
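The thermal cue described above can be sketched as a simple threshold test: pixels in an infrared image that exceed the ambient road temperature by some margin are candidate leak regions. This is a minimal illustration; the function name, data layout and the 2 K margin are assumptions for the example, not values from the disclosure.

```python
# Hypothetical sketch: leaked oil is warmer than the road surface, so
# pixels exceeding ambient temperature by a margin are candidate regions.

def warm_pixels(thermal_image, ambient_temp_c, margin_k=2.0):
    """Return (row, col) coordinates whose temperature exceeds ambient + margin."""
    hits = []
    for r, row in enumerate(thermal_image):
        for c, temp in enumerate(row):
            if temp > ambient_temp_c + margin_k:
                hits.append((r, c))
    return hits

# Tiny 3x3 thermal image (degrees Celsius); ambient road temperature 15 C.
image = [
    [15.1, 15.0, 14.9],
    [15.2, 40.0, 41.5],   # warm oil patch
    [15.0, 39.8, 15.1],
]
print(warm_pixels(image, ambient_temp_c=15.0))  # -> [(1, 1), (1, 2), (2, 1)]
```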
[0057] The central control device 6 is connected with a radio interface 13 to which an antenna 14 is connected. By way of the antenna, a radio data connection to a data network 15 (WAN: Wide Area Network), particularly the Internet, can be established. A server 16 is connected to the data network 15. By means of their cameras 11, 12, the individual motor vehicles scan a driving route and transmit the corresponding images by way of the data network 15 to the server 16. The images can be analyzed at the server 16, as explained in detail below.
[0058] The motor vehicles 1 are each equipped with a display device 10 which, by way of the radio interface 13, receives and displays messages from the server 16 in order to inform the respective vehicle driver. The vehicles 1 may further have a front-horizontal camera 17 with which a motor vehicle driving ahead can be acquired, so that, when a loss of media of the vehicle driving ahead is determined, identification information of that motor vehicle, particularly the license plate, can be acquired and transmitted to the server 16 by way of the data network 15.
[0059] In the following, the basic process principle will be explained by way of
[0060] The process starts in Step S1.
[0061] In front of the motor vehicle to be examined in the traveling direction, a forward image data stream is produced by the cameras 2, 3 and 11, 12, respectively, which comprises a series of successive forward images. The individual images are taken at predefined time or distance intervals, in which case the vehicle velocity provided by the central vehicle control 9 can be taken into account: the faster the vehicle is driving, the shorter the time interval at which a forward image is taken. All forward images are provided by the central control device 6 with a time stamp and/or location stamp, which indicates at which point in time and/or at which location the corresponding image of the driving route was taken. The time information is provided by the clock 7, and the location information is provided by the navigation system 8.
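One plausible way to realize the speed-dependent capture rate described above is to hold the distance between successive forward images roughly constant, so that the capture interval shrinks as the vehicle speeds up. The 2 m spacing and the interval clamp below are illustrative assumptions, not values from the disclosure.

```python
# Assumed scheme: constant image spacing along the route, so the time
# interval between captures scales inversely with the vehicle speed.

def capture_interval_s(speed_mps, spacing_m=2.0, min_s=0.05, max_s=1.0):
    """Time between image captures so images are roughly spacing_m apart."""
    if speed_mps <= 0:
        return max_s                      # standing still: slowest rate
    return max(min_s, min(max_s, spacing_m / speed_mps))

print(capture_interval_s(10.0))   # 10 m/s (36 km/h) -> 0.2 s between images
print(capture_interval_s(40.0))   # 40 m/s -> clamped to 0.05 s
```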
[0062] A rearward image data stream is produced in the same manner as the forward image data stream, wherein the cameras 4, 5 and 11, 12, respectively, are used, and the driving route is scanned behind the motor vehicle to be examined. The images of the rearward image data stream are provided in the same manner with a time stamp and/or a location stamp.
[0063] In the first embodiment (
[0064] During this evaluation or analysis, an image of the forward image data stream and an image of the rearward image data stream are assigned to one another (Step S3), this assignment taking place by means of the time stamp and/or location stamp. If the respective images contain a location stamp, images of the same location, or of locations as close together as possible, are assigned to one another. If the images contain no location stamp but only a time stamp, the images can be mutually assigned by the point in time at which they were acquired, while taking into account the driving velocity of the motor vehicle or of the traffic, so that those images of the driving route which were taken at the same location, or at locations as close together as possible, are assigned to one another. If the speed of the traffic is used for assigning a pair consisting of a forward and a rearward image, it is useful, in the case of the second embodiment, to transmit the velocity of the motor vehicle at the time each image is taken, by way of the data network 15, to the server 16. Because the forward and the rearward image are taken by different motor vehicles, which may change their velocity before each has driven over the same location of the driving route, it may be useful to average or integrate the velocity over the time interval between the taking of the forward image and the rearward image.
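Under the assumption that every image carries a location stamp, the assignment of Step S3 reduces to a nearest-location match: each rearward image is paired with the forward image whose stamp is closest. The data structures and field layout below are illustrative, not taken from the disclosure; a time-stamp-only variant would first convert time differences to distances via the integrated velocity.

```python
# Minimal sketch of Step S3 with location stamps: each image is a
# (location_m, image_id) pair, with location a 1-D position along the route.

def pair_by_location(forward_images, rearward_images):
    """Pair each rearward image with the closest forward image.
    Returns a list of (forward_id, rearward_id) tuples."""
    pairs = []
    for rear_loc, rear_id in rearward_images:
        fwd_loc, fwd_id = min(forward_images, key=lambda f: abs(f[0] - rear_loc))
        pairs.append((fwd_id, rear_id))
    return pairs

forward = [(0.0, "F0"), (2.1, "F1"), (4.0, "F2")]
rearward = [(2.0, "R0"), (3.9, "R1")]
print(pair_by_location(forward, rearward))  # -> [('F1', 'R0'), ('F2', 'R1')]
```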
[0065] When assigning a forward and a rearward image according to Step S3, a characteristics analysis of the images may also be carried out and specific characteristics may be extracted. The images in which the most characteristics coincide are then assigned to one another.
[0066] Once a pair of forward and rearward images is assigned, the images are mutually adapted in Step S4 with respect to resolution, contrast, color representation and/or perspective. In this case, it is mainly useful to adapt at least the perspectives to one another, so that the images can be superimposed and the pixels arranged above one another each represent the same location of the driving route. Since, as a rule, the cameras view the surface of the driving route at different inclination angles, such a correction of the perspective is often necessary.
[0067] The above-explained characteristics analysis can also be used for the adaptation and superimposition of the images, in that the forward and the rearward image are mutually adjusted so that the extracted characteristics are situated precisely above one another.
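The characteristic-based registration of Step S4 can be illustrated, in a heavily simplified form, by estimating a translation from matched characteristic points so that the features of the two images come to lie above one another. A real implementation would fit a full perspective homography (for the inclination-angle correction mentioned above) rather than a pure translation; all names below are assumptions for the example.

```python
# Simplified alignment sketch: given characteristic points matched between
# the forward and rearward image, estimate the average translation that
# maps rearward points onto forward points.

def estimate_translation(matches):
    """matches: list of ((fx, fy), (rx, ry)) matched point pairs.
    Returns the mean (dx, dy) offset from rearward to forward coordinates."""
    n = len(matches)
    dx = sum(fx - rx for (fx, _), (rx, _) in matches) / n
    dy = sum(fy - ry for (_, fy), (_, ry) in matches) / n
    return dx, dy

matches = [((10.0, 5.0), (8.0, 4.0)), ((20.0, 15.0), (18.0, 14.0))]
print(estimate_translation(matches))  # -> (2.0, 1.0)
```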
[0068] The mutually assigned forward and rearward images are compared in Step S5. Here, the comparison takes place by creating a differential image of the two images. However, other processes may also be used for the comparison of images.
[0069] Because the forward and the rearward image often cannot be adapted to one another in an absolutely precise fashion, the differential image contains many small speckles. These can be ignored. Only areas which have a defined minimum size are considered objects in the differential image. If the forward image and the rearward image are thermal images, objects are recognized by means of a predefined minimum temperature difference.
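Steps S5 and S6 together can be sketched as follows: the differential image is thresholded, connected regions are extracted, and regions below the minimum size are discarded as speckle noise. The thresholds and the 4-connected flood fill are illustrative choices, not values from the disclosure.

```python
# Sketch of differential-image object detection with speckle suppression.

def find_objects(forward, rearward, diff_threshold=10, min_size=3):
    """forward/rearward: equal-sized 2-D lists of pixel intensities.
    Returns a list of objects, each a set of (row, col) pixels."""
    rows, cols = len(forward), len(forward[0])
    mask = [[abs(forward[r][c] - rearward[r][c]) > diff_threshold
             for c in range(cols)] for r in range(rows)]
    seen, objects = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                stack, region = [(r, c)], set()
                seen.add((r, c))
                while stack:                       # flood fill (4-connected)
                    y, x = stack.pop()
                    region.add((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if len(region) >= min_size:        # ignore small speckles
                    objects.append(region)
    return objects

fwd = [[0] * 5 for _ in range(5)]
rear = [row[:] for row in fwd]
rear[1][1] = rear[1][2] = rear[2][1] = 50   # 3-pixel "oil" region
rear[4][4] = 50                             # 1-pixel speckle, ignored
print(len(find_objects(fwd, rear)))         # -> 1
```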
[0070] In Step S6, it is examined whether the differential image contains at least one object. If it is determined in Step S6 that no object is present, the process sequence changes over to Step S7. In Step S7, it is examined whether additional images of the image data streams are present. If so, the process sequence changes over to Step S3. Should there be no more additional images, the process is terminated with Step S8.
[0071] If, in contrast, it is determined in Step S6 that an object is contained in the differential image, the process sequence changes over to Step S9. In Step S9, the object or objects is/are subjected to an optical analysis. The optical analysis mainly takes into account the shape and/or color (=spectral analysis) of the objects. The objects can be analyzed in the differential image or in the rearward image, in which case, when the rearward image is used, those areas which had been considered an object in the differential image are read out as objects. When infrared cameras are used, the optical analysis can also be combined with an analysis of the temperature of the objects. Within the scope of the invention, it is also contemplated to use a scanning device which permits a three-dimensional scanning of the objects. Such a scanning device is, for example, a laser scanner, a time-of-flight camera or a stereo camera. When such a scanning device is used, the three-dimensional contour can also be taken into account during the analysis. It is also contemplated to determine the three-dimensional contour of an object from several successive images by means of triangulation and/or characteristics analysis. In this case, the individual images should each show the object from a slightly different position or perspective. For such a three-dimensional analysis to become possible, the relative position of the camera should be determined very precisely for the individual exposures. Because the movement of a modern motor vehicle is often acquired very precisely, this can be done. Thus, in the case of modern motor vehicles, the rotational speeds of all four wheels in contact with the road surface are often monitored and acquired independently of one another, so that the movement of the motor vehicle, and thereby of the camera, is acquired in a very exact manner.
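A heavily simplified illustration of the spectral analysis in Step S9: the mean color of an object's pixels is compared with reference colors for a few media classes. The reference values and class set below are invented for the example; a real classifier would also use shape, temperature and, where available, the three-dimensional contour.

```python
# Hypothetical nearest-reference-color classifier for detected objects.

REFERENCE_COLOURS = {            # assumed mean (R, G, B) per medium
    "oil":     (40, 35, 25),
    "coolant": (60, 180, 120),
    "paper":   (230, 230, 225),
}

def classify_medium(pixels):
    """pixels: list of (r, g, b) tuples for one object.
    Returns the reference class with the smallest color distance."""
    n = len(pixels)
    mean = tuple(sum(p[i] for p in pixels) / n for i in range(3))
    def dist(ref):
        return sum((m - v) ** 2 for m, v in zip(mean, ref))
    return min(REFERENCE_COLOURS, key=lambda k: dist(REFERENCE_COLOURS[k]))

print(classify_medium([(42, 33, 20), (38, 36, 28)]))  # -> 'oil'
```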
[0072] The objects are classified by way of this analysis. In this case, mainly the following objects or media are classified: [0073] oils [0074] coolant [0075] fuel [0076] vehicle parts (exhaust, underbody coating, exterior components) [0077] trash (cigarettes, cups, paper, foils, etc.)
[0078] As a function of the classified objects, predefined actions are carried out in Step S10. If it is determined, for example, that the object is oil, the driver of the monitored vehicle is informed of the loss of oil; a public authority, such as the police or the Road Traffic Department, is informed; and/or, by way of car-to-car communication, light signals and/or honking, other traffic participants are informed of the dangerous situation.
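Step S10 can be organized as a simple dispatch table from medium class to notification actions. The oil entry follows the text above; the other entries and all names are illustrative placeholders.

```python
# Sketch of Step S10: map each classified medium to predefined actions.

ACTIONS = {
    "oil":      ["inform_driver", "inform_authority", "warn_traffic"],
    "gasoline": ["inform_driver"],   # danger warning: park or visit a repair shop
    "trash":    ["inform_driver"],
}

def actions_for(medium):
    """Return the list of actions to trigger for a classified medium."""
    return ACTIONS.get(medium, [])

print(actions_for("oil"))       # -> ['inform_driver', 'inform_authority', 'warn_traffic']
print(actions_for("unknown"))   # -> []
```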
[0079] If the detected medium is gasoline, the driver of the monitored vehicle receives a danger warning because an explosive gasoline/air mixture may be forming in his vehicle. He should then either immediately park the car or drive to the nearest repair shop.
[0080] If the examination of the loss of medium is carried out by cameras which are not arranged on the motor vehicle to be examined, the motor vehicle to be examined is acquired by means of the front-horizontal camera 17 of the motor vehicle which follows it in traffic, and an image for identifying the motor vehicle is transmitted by way of the data network 15 to the server 16 or to an authorized authority, for example the police, which can identify the motor vehicle driver by way of the license plate and inform him correspondingly.
[0081] In the following, several embodiments of the invention will be explained by means of
[0082] The first embodiment illustrated in
[0083] At a point in time t1, the motor vehicle 1 is in location x0 and is losing oil (t1>t0).
[0084] At a point in time t2, the optical rear camera 4 acquires the driving route 18 in location x0 and produces a corresponding rearward image of the driving route 18 from location x0. In contrast to the forward image 19, the rearward image 20 shows oil slicks 21.
[0085] The two images 19 and 20 are analyzed and evaluated by way of the method explained in
[0086] The embodiment according to
[0087] By way of the forward stereo video camera 22, the driving route is acquired at the point in time t0 in location x0, and a forward image 19 of the driving route 18 is produced. At the point in time t1 (t1>t0), the motor vehicle 1 is in location x0 and is losing oil. At the point in time t2, the motor vehicle 1 is situated with its rear in location x0 of the driving route and, by way of the rearward stereo video camera 23, acquires the driving route 18 in location x0, and thereby generates a rearward image 20. The rearward image 20 contains oil slicks 21. These are again analyzed and evaluated by the process explained above by
[0088] The third embodiment according to
[0089] The forward image 19 is transmitted by way of the data network 15 to the server 16.
[0090] At the point in time t1 (t1>t0), the first vehicle 1/1 is in location x0 and is losing oil.
[0091] A second motor vehicle 1/2, in turn, has an optical front camera 2, which is oriented diagonally toward the front with its viewing direction. At a point in time t2 (t2>t1), the optical front camera of the second vehicle 1/2 acquires a rearward image 20. The rearward image 20 is also transmitted by way of the data network 15 to the central server 16. The central server 16 evaluates the thus obtained images corresponding to the process explained above by means of
[0092] The fourth embodiment (
[0093] By way of the optical front camera, the driving route 18 is optically scanned. By way of the acquired image, it can be recognized that no oil contamination is present. The first motor vehicle 1/1 is losing oil at the point in time t1. The corresponding oil slicks are acquired by the second motor vehicle 1/2 by the infrared camera 3, in which case, a rearward thermal image 20 is produced, which shows the warm oil slick.
[0094] In the fifth embodiment (
[0095] The motor vehicle 1/1 is followed in traffic by the motor vehicle 1/2 which, at the point in time t1 (t1>t0), is situated in location x0 and is losing oil there. A motor vehicle 1/3 is following the motor vehicle 1/2 and, at the point in time t2, scans the driving route by an optical front camera 2 in location x0 and produces a rearward image 20. The rearward image 20 is, in turn, provided with a time and location stamp and is transmitted by way of the data network 15 to the central server 16. At the central server 16, the forward image 19 and the rearward image 20, which shows the oil slick 21, are analyzed and evaluated according to the process explained above by way of
[0096] The sixth embodiment illustrated in
[0097] The above-explained embodiments show that cameras may be provided at different vehicles in order to be able to detect a loss of media. It is even possible to use different types of cameras or scanning devices in a joint process for examining a loss of media.
[0098] The above-explained embodiments show how a loss of oil can be detected. Likewise, by use of the process according to the invention, the loss of other media can be acquired and evaluated.
[0099] If the media are solid mechanical parts, an object recognition can also be carried out by means of a differential image. As scanning devices, radar sensors may be used, particularly radar sensors which radiate radar waves of different wavelengths.
REFERENCE NUMBERS
[0100] 1 Motor vehicle [0101] 2 Optical front camera [0102] 3 Infrared front camera [0103] 4 Optical rear camera [0104] 5 Infrared rear camera [0105] 6 Central control device [0106] 7 Clock [0107] 8 Navigation system [0108] 9 Central vehicle control [0109] 10 Display device [0110] 11 Optical camera [0111] 12 Infrared camera [0112] 13 Radio interface [0113] 14 Antenna [0114] 15 Data network [0115] 16 Server [0116] 17 Front-horizontal camera [0117] 18 Driving route [0118] 19 Forward image [0119] 20 Rearward image [0120] 21 Oil slick [0121] 22 Forward stereo video camera [0122] 23 Rearward stereo video camera
[0123] The foregoing disclosure has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.