IN-VEHICLE SENSOR SYSTEM
20210387616 · 2021-12-16
Assignee
Inventors
CPC classification
B60W30/0956
PERFORMING OPERATIONS; TRANSPORTING
B60W30/0953
PERFORMING OPERATIONS; TRANSPORTING
G01S2013/932
PHYSICS
G01S17/86
PHYSICS
B60W2520/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
A system of the present disclosure includes a coarse observation sensor configured to observe a range around a vehicle, high-accuracy observation object identification means configured to identify a high-accuracy observation object that is an object detected by the coarse observation sensor in the observation range and is an object to be observed at a higher resolution, object presence area prediction means configured to predict a range of an object future presence area where the high-accuracy observation object may be present after the identification, a fine observation sensor configured to observe the range of the object future presence area at the higher resolution, and object information output means configured to output information on the high-accuracy observation object observed by the fine observation sensor.
Claims
1. An in-vehicle sensor system configured to observe a situation around a vehicle, the in-vehicle sensor system comprising: a first sensor configured to observe a predetermined range around the vehicle at a first resolution; high-accuracy observation object identification means configured to identify a high-accuracy observation object, the high-accuracy observation object being an object detected by the first sensor in the predetermined range and being an object to be observed at a second resolution, the second resolution being higher than the first resolution; object presence area prediction means configured to predict a range of an object future presence area, the object future presence area being an area where the high-accuracy observation object may be present after the identification; a second sensor configured to observe the range of the object future presence area at the second resolution; and object information output means configured to output information on the high-accuracy observation object observed by the second sensor.
2. The in-vehicle sensor system according to claim 1, wherein: the high-accuracy observation object identification means is configured to detect a position or range of a presence area of the high-accuracy observation object in the predetermined range observed by the first sensor; and the object presence area prediction means is configured to predict a position or range of the object future presence area seen from the vehicle, based on the position or range of the presence area of the high-accuracy observation object in the predetermined range.
3. The in-vehicle sensor system according to claim 2, wherein: the high-accuracy observation object identification means is further configured to detect a type of the high-accuracy observation object; and the object presence area prediction means is configured to predict the position or range of the object future presence area seen from the vehicle, based on the position or range of the presence area of the high-accuracy observation object in the predetermined range and, in addition, based on the detected type of the high-accuracy observation object.
4. The in-vehicle sensor system according to claim 2, the system further comprising vehicle motion state acquisition means configured to acquire a vehicle speed or moving distance, and/or a turning state value or turning angle, of the vehicle, wherein the object presence area prediction means is configured to predict the position or range of the object future presence area seen from the vehicle, based on the position or range of the presence area of the high-accuracy observation object in the predetermined range and, in addition, based on the vehicle speed or moving distance, and/or the turning state value or turning angle, of the vehicle.
5. The in-vehicle sensor system according to claim 3, wherein the object presence area prediction means is configured to predict the object future presence area that varies in size depending upon the type of the high-accuracy observation object.
6. The in-vehicle sensor system according to claim 2, wherein: the high-accuracy observation object identification means is further configured to detect a relative speed and/or a relative moving direction of the high-accuracy observation object seen from the vehicle; and the object presence area prediction means is configured to predict the position or range of the object future presence area seen from the vehicle, based on the position or range of the presence area of the high-accuracy observation object in the predetermined range and, in addition, based on the detected relative speed and/or detected relative moving direction of the high-accuracy observation object.
7. The in-vehicle sensor system according to claim 6, the system further comprising vehicle motion state acquisition means configured to acquire a turning state value or turning angle of the vehicle, wherein the object presence area prediction means is configured to predict the position or range of the object future presence area seen from the vehicle, based on the position or range of the presence area of the high-accuracy observation object in the predetermined range, based on the relative speed and/or relative moving direction of the high-accuracy observation object, and based on the turning state value or turning angle of the vehicle.
8. The in-vehicle sensor system according to claim 1, wherein the high-accuracy observation object identification means is configured to include detected-object threat level determination means to determine the high-accuracy observation object based on a threat level of an object, the detected-object threat level determination means being configured to determine the threat level of the object, the threat level representing a level of an impact of the object on traveling of the vehicle, the object being an object detected in the predetermined range observed by the first sensor.
9. The in-vehicle sensor system according to claim 8, wherein the high-accuracy observation object identification means is configured to select at least one object in descending order of the threat level as the high-accuracy observation object.
10. The in-vehicle sensor system according to claim 1, wherein the first and second sensors are sensors selected from a camera, a millimeter wave radar, and a lidar.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] Features, advantages, and technical and industrial significance of exemplary embodiments of the disclosure will be described below with reference to the accompanying drawings, in which like signs denote like elements, and wherein:
[0020]-[0025] (Brief descriptions of the individual figures are not reproduced in this excerpt.)
DETAILED DESCRIPTION
Configuration of In-Vehicle Sensor System
[0026] With reference to
[0027] (Note) The resolution of the coarse observation sensor 14 and the fine observation sensor 16 may be a spatial resolution or an angular resolution. The spatial resolution is the minimum distance between two points at which the points can be distinguished in the space observed by the sensor. The angular resolution is the minimum angle between two points at which the points can be distinguished in the visual field observed by the sensor. A high resolution means that the distinguishable distance or angle between two points is small.
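As an illustrative sketch only (the function name and the numeric values are hypothetical, not part of the disclosure), the relationship between angular and spatial resolution at a given observation distance can be approximated by the arc length s = r·θ:

```python
import math

def spatial_resolution(angular_resolution_deg: float, range_m: float) -> float:
    """Smallest distinguishable distance between two points at the given
    range, for a sensor with the given angular resolution (small-angle
    arc-length approximation: s = r * theta)."""
    return range_m * math.radians(angular_resolution_deg)

# A hypothetical coarse sensor with 1.0 deg angular resolution separates
# points about 0.87 m apart at 50 m; a hypothetical fine sensor with
# 0.1 deg separates points about 0.09 m apart at the same range.
coarse = spatial_resolution(1.0, 50.0)
fine = spatial_resolution(0.1, 50.0)
```

This makes concrete why a "high" resolution corresponds to a small distinguishable distance or angle between two points.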
[0028] The observation control device 12 may be implemented by a computer or a driving circuit that has a CPU, a ROM, a RAM, and an input/output port device interconnected by a standard, bidirectional common bus. The configuration and the operation of each of the components of the observation control device 12, which will be described later, may be implemented by the operation of the computer that works according to a program.
[0029] Referring to
[0030] When the high-accuracy observation object and the object future presence area are determined as described above, the information is given to the fine observation sensor 16 so that the fine observation sensor 16 can observe the object at a higher resolution in the object future presence area. The observation data obtained during this observation (such as reflected-wave signal intensity) is sent to the observation result processing unit of the observation control device 12, where it is converted to a data format in which the object can be recognized. Then, the object recognition unit detects and recognizes the object in the object future presence area using that data, with the result that its position, presence range, type, speed, moving direction, etc. (seen from the vehicle) are detected at the resolution of the fine observation sensor 16.
[0031] In this way, the extensive information on the situation around the vehicle, detected and recognized by the coarse observation sensor 14, and the more accurate recognition information on a high-accuracy observation object, obtained during the observation by the fine observation sensor 16, are sent to the observation result integration/output unit. From that unit, the information on the situation around the vehicle and the information on the object may be sent to the corresponding control device so that the information will be used for driving assistance control and autonomous driving control.
System Operation
[0032] (1) Overview
[0033] As mentioned in “Summary of the Disclosure”, when using the observation information on the situation around a vehicle for driving assistance control or autonomous driving control, it is preferable that the information on an object in the observed range, such as the position or presence range, type, speed, and moving direction, be detected and recognized with a higher accuracy. However, the higher the accuracy of the observation, the longer the time required for the observation. Therefore, it is sometimes impossible to secure sufficient time for accurately observing the whole range to be observed around the vehicle, for example, while the vehicle is traveling. On the other hand, an object for which high-accuracy information is required for driving assistance control or autonomous driving control is usually present in only a part of the range to be observed around the vehicle. This means that, once the approximate position of an object for which high-accuracy information is desired can be recognized, it is, in some cases, sufficient for high-accuracy observation to be performed only for that object. Considering this fact, the observation is performed in the system in this embodiment as described in Japanese Patent Application No. 2020-71587. That is, in consideration of the speed of the motion of the vehicle, the observation of the whole observation range around the vehicle is performed quickly at a resolution just high enough to obtain information such as the presence/absence, position or presence range, type, speed, and moving direction of the objects in the observation range, while observation at a high resolution is performed only for an object that needs to be observed with high accuracy. This reduces the whole observation time and, at the same time, gives high-accuracy information suitable for driving assistance control or autonomous driving control.
[0034] However, in the observation described above, a certain amount of time elapses from the time the whole observation range around the vehicle is observed using the coarse observation sensor, the objects in the observation range are recognized, and a high-accuracy observation object is identified, to the time the observation using the fine observation sensor is started. During this period of time, the high-accuracy observation object or the vehicle may move to another position or change direction. For example, as schematically shown in
[0035] (2) Operation of Observation Processing
[0036] Referring to
(i) Observation of the whole observation range around the vehicle by the coarse observation sensor (step 1)
(ii) Recognition of objects in the observation range (step 2)
(iii) Determination of a high-accuracy observation object (step 3)
(iv) Prediction of the future presence area of the high-accuracy observation object (steps 4 to 6)
(v) Observation of the object future presence area by the fine observation sensor (step 7)
(vi) Recognition of an object in the object future presence range (step 8)
(vii) Output of the observation result (step 9)
The above processing will be described below sequentially.
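The seven stages above can be sketched as a single observation cycle. The following is a minimal illustration only; every function and dictionary key below is a hypothetical placeholder, not a component named in the disclosure:

```python
def observation_cycle(coarse_observe, recognize, threat_of, predict_area, fine_observe):
    """One observation cycle of stages (i)-(vii). All callables are
    illustrative placeholders standing in for the sensors and the units
    of the observation control device."""
    # (i)-(ii): coarse observation of the whole range, then recognition
    objects = recognize(coarse_observe())
    if not objects:
        return objects, None
    # (iii): take the highest-threat object as the high-accuracy observation object
    target = max(objects, key=threat_of)
    # (iv): predict the object future presence area for that target
    area = predict_area(target)
    # (v)-(vi): fine observation restricted to the predicted area, then recognition
    refined = recognize(fine_observe(area))
    # (vii): output the coarse results plus the refined target information
    return objects, refined

# Minimal stubbed run: two detected objects; the nearer one has the
# higher threat and is re-observed "finely".
objects, refined = observation_cycle(
    coarse_observe=lambda: "raw",
    recognize=lambda data: [{"id": "a", "dist": 40}, {"id": "b", "dist": 10}]
        if data == "raw" else [{"id": "b", "dist": 10, "fine": True}],
    threat_of=lambda o: 1.0 / o["dist"],
    predict_area=lambda o: ("area-near", o["id"]),
    fine_observe=lambda area: "fine",
)
```

The point of the structure is that the expensive fine observation touches only the predicted area, never the whole range.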
[0037] (i) Observation of the Whole Observation Range Around the Vehicle by the Coarse Observation Sensor (Step 1)
[0038] As mentioned above, the observation by the coarse observation sensor may be typically performed by capturing an image by the camera in the usual manner as quickly as possible in the area to be observed around the vehicle (in front of, to the right and left of, and behind the vehicle, respectively). The resolution required in this case may be a resolution high enough to identify the presence or absence of objects in the observation range and to identify the positions or presence ranges of the objects at a certain degree of accuracy. The data obtained by the coarse observation sensor (usually brightness data or intensity data) may be generated as two-dimensional (or three-dimensional) image data by the image generation unit.
[0039] (ii) Recognition of Objects in the Observation Range (Step 2)
[0040] In the image data obtained by the image generation unit, the images of objects (such as other vehicles, roadside buildings, walls, fences, guardrails, poles, parked vehicles, pedestrians and bicycles, road ends, road markings (white lines, yellow lines), and traffic lights) are recognized, and the positions or presence ranges of those objects are detected (at the resolution of the coarse observation sensor). In addition, as will be described later, the type of an object or the moving speed and moving direction (seen from the vehicle) of an object may be detected in this step. A plurality of objects may be detected in the observation range. An object may be recognized and detected using any image recognition technique.
[0041] (iii) Determination of a High-Accuracy Observation Object (Step 3)
[0042] An object that is included in the objects recognized in the observation range of the coarse observation sensor in step 2 and is to be observed with particularly high accuracy may be determined by any method according to the use purpose of the observation result. For example, when the observation result is used for driving assistance control for collision avoidance or for autonomous driving control, an object that will have a large impact on later driving may be selected as a high-accuracy observation object by referring to the distance from the vehicle to the object, the moving direction of the object, and the type of the object. In one mode, a threat level may be given to each of the objects recognized in the observation range. In giving this threat level, it is assumed that the threat of an object to the traveling of the vehicle increases (the need for attention increases) as the distance to the vehicle is shorter, as the moving direction is more likely to intersect the traveling path of the vehicle, or as the moving speed is higher; alternatively, it is assumed that the threat increases in the order of a stationary object, another vehicle, and a person. After that, the threat-level scores of each object are totaled, the objects are ranked according to the threat level, and the high-accuracy observation objects to be observed preferentially are determined in descending order of the rank. A plurality of objects may be selected as high-accuracy observation objects in a certain observation range.
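One possible realization of this ranking is sketched below. The weights, the proximity scale factor, and the function names are illustrative assumptions, not values from the disclosure; only the ordering stationary object < other vehicle < person and the "nearer/faster-closing means more threatening" rule come from the text:

```python
# Illustrative base threat by type: stationary object < vehicle < person,
# per the ordering assumed in the embodiment.
TYPE_THREAT = {"stationary": 1.0, "vehicle": 2.0, "person": 3.0}

def threat_level(distance_m, closing_speed_mps, obj_type):
    """Higher threat for nearer, faster-closing objects, weighted by type.
    closing_speed_mps > 0 means the object is approaching the vehicle's
    traveling path. The 100.0 scale factor is an arbitrary assumption."""
    proximity = 1.0 / max(distance_m, 1.0)      # nearer => higher
    closing = max(closing_speed_mps, 0.0)       # approaching => higher
    return TYPE_THREAT.get(obj_type, 1.0) * (proximity * 100.0 + closing)

def select_targets(objects, n=1):
    """Pick the n highest-threat objects as high-accuracy observation
    objects, in descending order of threat level."""
    ranked = sorted(objects, key=lambda o: threat_level(*o), reverse=True)
    return ranked[:n]

# Example: a pedestrian at 20 m closing at 1 m/s outranks a stationary
# object at 10 m because of the type weighting.
objs = [(10.0, 0.0, "stationary"), (20.0, 1.0, "person")]
top = select_targets(objs, 1)
```

Selecting several targets per cycle, as claim 9 allows, is simply `select_targets(objs, n)` with n > 1.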
[0043] (iv) Prediction of the Future Presence Area of the High-Accuracy Observation Object (Steps 4 to 6)
[0044] After a high-accuracy observation object is determined as described above, the expected position of the high-accuracy observation object in the future, more specifically, the expected position when observation is performed by the fine observation sensor (that is, the object future presence area) is predicted. The prediction of the object future presence area may be achieved in various ways, for example, as follows.
[0045] (a) Prediction of the Object Future Presence Area by Referring to the Type of the High-Accuracy Observation Object
[0046] In one mode, the object future presence area may be predicted according to the type of the high-accuracy observation object. In short, the movable range of the high-accuracy observation object from the observation position is calculated in this case based on the moving speed predicted according to the type of the high-accuracy observation object, and the calculated movable range is predicted as the object future presence area. In addition, the predicted position of the vehicle at the time when the observation is performed by the fine observation sensor may be calculated using the vehicle motion information, the object future presence area may be corrected to the position seen from the predicted position of the vehicle and, in addition, the angular range seen from the vehicle for observing the object future presence area may be determined.
[0047] More specifically, first, referring to
X_ob = (r_o cos θ_o, r_o sin θ_o)   (1)
[0048] Now, let Δt be the length of time from the coarse observation sensor observation time t1 to the fine observation sensor observation time t2, and let v_max be the maximum moving speed assumed for the high-accuracy observation object ob. Then, the expected position of the high-accuracy observation object ob after an elapse of time Δt is on, or inside, a circle with a radius of v_max·Δt and its center at the position X_ob, as shown in
[0049] Person: 0 km/h (It is thought that a person hardly moves)
[0050] Bicycle: 20 km/h
[0051] Automobile: 100 km/h
[0052] Motorcycle: 80 km/h
Thus, the range to which the high-accuracy observation object ob will move in the future is predicted to be on or inside the following circle (step 4):
{(X − r_o cos θ_o)² + (Y − r_o sin θ_o)²}^(1/2) = v_max·Δt   (2)
[0053] That is, the object future presence range that varies in size depending on the type of the high-accuracy observation object may be predicted. When the motion of the vehicle 10 is not taken into consideration, this range to which the high-accuracy observation object ob will move in the future may be used as the object future presence area.
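The type-dependent prediction of expressions (1) and (2) can be sketched as follows. The assumed maximum speeds are the embodiment's own examples; the function name and the sample numbers are illustrative assumptions:

```python
import math

# Assumed maximum moving speeds by object type, taken from the
# embodiment's examples (km/h); converted to m/s inside the function.
V_MAX_KMH = {"person": 0.0, "bicycle": 20.0, "automobile": 100.0, "motorcycle": 80.0}

def future_presence_circle(r_o, theta_o, obj_type, dt):
    """Predict the object future presence area as the circle of
    expression (2): center at X_ob = (r_o cos(theta_o), r_o sin(theta_o))
    per expression (1), radius v_max * dt."""
    v_max = V_MAX_KMH[obj_type] / 3.6                # km/h -> m/s
    center = (r_o * math.cos(theta_o), r_o * math.sin(theta_o))
    return center, v_max * dt

# A bicycle observed at 30 m, 45 deg, with 0.2 s between the coarse and
# fine observations, may move anywhere within about 1.1 m of its
# observed position; a person (v_max = 0) yields a zero-radius circle.
center_b, radius_b = future_presence_circle(30.0, math.radians(45.0), "bicycle", 0.2)
center_p, radius_p = future_presence_circle(30.0, 0.0, "person", 0.2)
```

This is exactly the sense in which the object future presence area "varies in size depending on the type" of the high-accuracy observation object.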
[0054] In addition, when the motion information on the vehicle 10 is acquired (step 5) and the range to which the high-accuracy observation object ob will move in the future, predicted as described above, is corrected based on the acquired motion information (step 6), the accuracy of the object future presence area is improved. More specifically, when the vehicle 10 is moving, for example, at the vehicle speed V_c, initial yaw rate γ_o, and yaw angular acceleration a_c as shown in
Ψ_f = γ_o·Δt + a_c·Δt²/2   (3a)
x_vf = ∫ V_c cos Ψ_f(t) dt   (3b)
y_vf = ∫ V_c sin Ψ_f(t) dt   (3c)
(The integration interval is [0, Δt].)
[0055] Here, the center position X_obf = (x_obf, y_obf) of the high-accuracy observation object ob seen from the vehicle 10 at time t2 is expressed as follows by converting the coordinates from the X-Y coordinates to the Xf-Yf coordinates using expressions (3a) to (3c), that is, by translating by (x_vf, y_vf) and rotating by Ψ_f:
x_obf = (x_ob − x_vf) cos Ψ_f + (y_ob − y_vf) sin Ψ_f
y_obf = −(x_ob − x_vf) sin Ψ_f + (y_ob − y_vf) cos Ψ_f   (4)
[0056] Thus, the object future presence area of the high-accuracy observation object ob seen from the vehicle 10 after time t_o is predicted to be on the following circle:
{(X_f − x_obf)² + (Y_f − y_obf)²}^(1/2) = v_max·Δt   (5)
[0057] That is, when the motion of the vehicle is taken into consideration, the object future presence area of the high-accuracy observation object ob seen from the vehicle 10 moves from the circle W, which is the presence area at the time of observation by the coarse observation sensor, to the circle W_f, as shown in
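A numerical sketch of this motion correction follows. The midpoint-rule integration of expressions (3b) and (3c), the step count, and the function names are assumptions of this illustration (the disclosure leaves the integrals in closed form); the frame conversion is the standard translate-then-rotate transform:

```python
import math

def vehicle_pose(v_c, gamma_o, a_c, dt, steps=1000):
    """Numerically evaluate expressions (3a)-(3c): the vehicle's yaw
    angle and displacement over time dt, given speed v_c, initial yaw
    rate gamma_o, and yaw angular acceleration a_c (midpoint rule)."""
    h = dt / steps
    x_vf = y_vf = 0.0
    for i in range(steps):
        t = (i + 0.5) * h                      # midpoint of each sub-interval
        psi = gamma_o * t + a_c * t * t / 2.0  # (3a) evaluated at t
        x_vf += v_c * math.cos(psi) * h        # (3b)
        y_vf += v_c * math.sin(psi) * h        # (3c)
    psi_f = gamma_o * dt + a_c * dt * dt / 2.0
    return x_vf, y_vf, psi_f

def to_future_frame(x_ob, y_ob, x_vf, y_vf, psi_f):
    """Express the object's observed position in the Xf-Yf frame of the
    moved, turned vehicle (standard translate-then-rotate transform)."""
    dx, dy = x_ob - x_vf, y_ob - y_vf
    return (dx * math.cos(psi_f) + dy * math.sin(psi_f),
            -dx * math.sin(psi_f) + dy * math.cos(psi_f))

# Straight-line check: with zero yaw, the vehicle advances v_c*dt along
# X, so the object's coordinates simply shift back by that distance.
x_vf, y_vf, psi_f = vehicle_pose(v_c=20.0, gamma_o=0.0, a_c=0.0, dt=0.5)
x_f, y_f = to_future_frame(30.0, 5.0, x_vf, y_vf, psi_f)
```

The circle W_f is then simply the circle of radius v_max·Δt around the transformed center (x_f, y_f).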
[0058] Thus, as shown in
Therefore, the angular range ps to be observed by the fine observation sensor 16 can be determined by calculating the maximum value φ_max and the minimum value φ_min of the angular coordinate φ in expression (6) over the range indicated by expression (6a).
[0059] If the moving distance (x_vf, y_vf) or the turning angle Ψ_f of the vehicle during time Δt can be obtained in real time from the GPS information etc. when predicting the object future presence area as described above, those obtained values may be used in place of the calculations in expressions (3a) to (3c). In addition, if the turning angle Ψ_f during time Δt is negligible, the coordinate rotation calculation in expression (4) need not be performed. If v_max = 0, the object future presence area may be predicted as a range of the object size d.
[0060] (b) Prediction of the Object Future Presence Area Using the Speed of a High-Accuracy Observation Object
[0061] In another mode, when the speed and the moving direction of an object seen from the vehicle can be detected during the observation by the coarse observation sensor (when the relative speed in the x direction and the relative speed in the y direction of the object can be detected separately (step 2 in
[0062] More specifically, when the object ob recognized in the observation range cs of the coarse observation sensor 14 is determined as a high-accuracy observation object as shown in
[0063] Thus, at observation time t2 of the fine observation sensor, the high-accuracy observation object ob will move to the position ob_f shown in
r_f = (x_obf² + y_obf²)^(1/2)   (8a)
φ_f = tan⁻¹(y_obf/x_obf)   (8b)
[0064] Therefore, the angular range ps to be observed by the fine observation sensor 16 is determined as a range between the following angles:
φ_min = φ_f − tan⁻¹(d/(2r_f))   (9a)
φ_max = φ_f + tan⁻¹(d/(2r_f))   (9b)
[0065] If the turning angle Ψ_f of the vehicle during time Δt can be obtained in real time from the GPS information etc. when predicting the object future presence area as described above, the obtained value may be used in place of the calculation in expression (3a). In addition, if the turning angle Ψ_f during time Δt is negligible, the coordinate rotation calculation in expression (7) need not be performed.
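The angular range computation of expressions (8a)-(9b) can be sketched directly. The function name and the sample numbers are illustrative assumptions:

```python
import math

def fine_observation_angles(x_obf, y_obf, d):
    """Angular range ps for the fine observation sensor per expressions
    (8a)-(9b): the predicted center in polar form, widened on each side
    by the half-angle subtended by the object size d."""
    r_f = math.hypot(x_obf, y_obf)               # (8a)
    phi_f = math.atan2(y_obf, x_obf)             # (8b)
    half = math.atan(d / (2.0 * r_f))
    return phi_f - half, phi_f + half            # (9a), (9b)

# A 2 m-wide object predicted at (20, 10) m lies at phi_f of roughly
# 26.6 deg and subtends roughly 5 deg, so only that narrow wedge needs
# the fine observation.
phi_min, phi_max = fine_observation_angles(20.0, 10.0, 2.0)
```

Using atan2 rather than a plain arctangent keeps the angle correct when x_obf is negative, i.e., when the predicted position is behind the sensor axis.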
[0066] (v) Observation of the Object Future Presence Area by the Fine Observation Sensor (Step 7)
[0067] When the object future presence area is predicted and the angular range ps in which the predicted object future presence area can be viewed is determined as described above, the observation is performed by the fine observation sensor at an angle in the angular range ps. The resolution required in this step may be a resolution high enough that the resulting information is acceptable or satisfactory for driving assistance control or autonomous driving control.
[0068] (vi) Recognition of an Object in the Object Future Presence Range (Step 8)
[0069] The data obtained by the fine observation sensor (usually, intensity data or brightness data) may be sent to the observation result processing unit, which converts it into a data format that allows the object to be recognized. After that, the object recognition unit recognizes the high-accuracy observation object based on the data obtained by the observation result processing unit. More specifically, the position or presence range and the type are identified, and the moving speed and the moving direction are detected, with an accuracy higher than that obtained by the coarse observation sensor.
[0070] (vii) Output of the Observation Result (Step 9)
[0071] The information on the object recognized/detected through the observation by the coarse observation sensor and the fine observation sensor as described above may be integrated, as appropriate, and output to the corresponding control devices for use in driving assistance control and autonomous driving control.
[0072] Thus, as described in the above example, the system in this embodiment is an in-vehicle sensor system for observing the area around the vehicle using the coarse observation sensor and the fine observation sensor. This in-vehicle sensor system predicts the position, or the presence area, of an object to be observed by the fine observation sensor in consideration of the motion of the object to be detected by the sensor or the motion of the vehicle itself and performs observation by the fine observation sensor at the predicted position or in the predicted presence area. Therefore, it is expected that the observation of a high-accuracy observation object will be performed more reliably. The information on the area around the vehicle, acquired by the system in this embodiment, may be advantageously used in driving assistance control and autonomous driving control of the vehicle.
[0073] Although the above description has been made in connection with the embodiments of the present disclosure, many changes and modifications can be easily made by those skilled in the art. It is apparent that the present disclosure is not limited to the embodiments exemplified above but may be applied to various devices without departing from the concept of the present disclosure.