Method to determine distance of an object from an automated vehicle with a monocular device

09862318 · 2018-01-09


Abstract

A method of determining the distance of an object from an automated vehicle based on images taken by a monocular image acquiring device. The object is recognized and associated with an object-class by means of an image processing system. Respective position data are determined from the images using a pinhole camera model based on the object-class, the position data indicating in world coordinates the position of a reference point of the object with respect to the plane of the road. A scaling factor of the pinhole camera model is estimated by means of a Bayes estimator using the position data as observations and under the assumption that the reference point of the object is located on the plane of the road with a predefined probability. The distance of the object from the automated vehicle is calculated from the estimated scaling factor using the pinhole camera model.

Claims

1. A method of determining the distance of an object from an automated vehicle, said method comprising: equipping an automated vehicle with a monocular image acquiring device, said device located on the automated vehicle at a predefined height above a plane of a road on which the automated vehicle travels; capturing images of an environment proximate to the vehicle, wherein the images of the environment are taken at time intervals by the device; classifying, by an image processing system, an object in the images; associating, by the image processing system, the object with an object-class; determining, by the image processing system, respective position data from the images using a pinhole camera model and based on the object-class, said position data indicative of a position of a reference point of the object with respect to the plane of the road in world coordinates; estimating a scaling factor of the pinhole camera model using a Bayes estimator on the position data as observations and under the assumption that the reference point of the object is located on the plane of the road with a predefined probability; and calculating a distance of the object from the automated vehicle based on the scaling factor using the pinhole camera model, wherein an angle of inclination that a view ray adopts with respect to the plane of the road is used as the position indication, said view ray leading to the reference point of the object from a point produced by projecting a center of the automated vehicle perpendicularly onto the plane of the road, or from a fixed point arranged at a known distance therefrom.

2. A method in accordance with claim 1, wherein a further assumption is used as the basis for the Bayes estimator in that a width of the object is associated with the object-class and adopts a standard value associated with the object-class with a predefined probability.

3. A method in accordance with claim 2, wherein the assumption that the reference point of the object is located on the plane of the road with a predefined probability is modeled by a continuous a priori distribution, whereas the assumption that the width of the object adopts a standard value with a predefined probability is modeled by a discrete a priori distribution.

4. A method in accordance with claim 1, wherein results of a time filtering of the position data are used as observations for the Bayes estimator, wherein mean time values over a respective plurality of successively taken images are formed for the position data.

5. A method in accordance with claim 1, wherein the scaling factor (p) of the pinhole camera model is estimated by determining a modal value of an a posteriori distribution of the observations.

6. A method in accordance with claim 1, wherein a normal distribution can be fixed around the value zero as an a priori distribution of the angle of inclination.

7. A method in accordance with claim 1, wherein movement of the object is tracked by a tracking process that uses a recursive state estimator, in particular a Kalman filter, and wherein the estimated scaling factor is used as an input for the recursive state estimator.

8. A method in accordance with claim 1, wherein the object is associated with one of a plurality of object-classes by a classifier, wherein a separate width range and a separate set of discrete width values are defined for each object-class for fixing the a priori distribution of the width of the object.

9. A method in accordance with claim 8, wherein the object is associated with one of the object-classes four-wheeled vehicle and two-wheeled vehicle by the classifier, wherein an object associated with the object-class four-wheeled vehicle is associated with one of the subclasses passenger vehicle, van and truck.

10. A method in accordance with claim 8, wherein a ground truth of the classifier is used for solving the equations of the Bayes estimator.

11. A method in accordance with claim 1, wherein a smallest enclosing rectangle is determined for the object with a base located at the lower margin of the smallest enclosing rectangle being selected as the reference point of the object.

Description

BRIEF DESCRIPTION OF DRAWINGS

(1) The present invention will now be described, by way of example, with reference to the accompanying drawings, in which:

(2) FIG. 1 is a schematic representation of an apparatus for recognizing and tracking an object from an automated vehicle in accordance with one embodiment;

(3) FIG. 2 shows, using a pinhole camera model, the relationship between the angle of inclination of an object point with respect to the plane of the road and the scaling factor of the pinhole camera model in accordance with one embodiment; and

(3) FIG. 3 is a diagram of a method for determining the distance of an object from an automated vehicle in accordance with one embodiment.

DETAILED DESCRIPTION

(5) FIG. 1 illustrates a non-limiting example of an apparatus that includes a monocular image capturing device 11, hereafter referred to as the device 11, which may be, for example, a digital camera. The apparatus also includes an image processing system 13, preferably a computer-assisted image processing system, associated with or in communication with the device 11. The device 11 is advantageously mounted on an automated vehicle (not shown) at a predefined or known height h above the plane of a road 21 and is configured to capture images of the environment about the vehicle at regular time intervals. Using the images output by the device 11, the image processing system 13 may classify and/or track objects 17 of interest, such as pedestrians, other vehicles, road signs, and the like. Only one object 17 is shown in FIG. 1 in schematic form for reasons of simplicity. For object recognition, object-classification, and object tracking, the image processing system 13 preferably works with at least one classifier and at least one recursive state estimator, as is generally known.

(6) In order to determine a distance d from the automated vehicle to the object 17 using only the device 11, the object is first recognized in a typical manner by means of the image processing system 13 and is associated with an object-class. Within the framework of this process, a smallest enclosing rectangle or so-called bounding box is determined, and a base 19 of the object 17 located at the center of the lower margin of the smallest enclosing rectangle is defined as the reference point for the further processing steps. The device 11 is then mathematically modeled by a pinhole camera model, which is illustrated in FIG. 2.
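The reference-point selection described above can be sketched as follows. This is a minimal illustration only; the function name and the (x, y, w, h) bounding-box convention are assumptions, not taken from the patent text.

```python
# Hypothetical sketch of selecting the base 19 as the reference point:
# the center of the lower margin of the smallest enclosing rectangle.
# Box convention (assumed): (x, y, w, h), with (x, y) the top-left
# corner in image coordinates and y growing downward.

def base_reference_point(box):
    """Return the center of the lower margin of a bounding box."""
    x, y, w, h = box
    return (x + w / 2.0, y + h)
```

For a box (10, 20, 40, 30), the reference point is the midpoint of its bottom edge, (30.0, 50.0).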

(7) The relationship between the world coordinates $x_w, y_w, z_w$ of a specific scene point and the associated image coordinates $x_i, y_i, z_i$ is given by Eq. 1 in the pinhole camera model:

(8) $$\begin{pmatrix} x_i \\ y_i \\ z_i \\ 1 \end{pmatrix} = C \cdot \begin{pmatrix} x_w \\ y_w \\ z_w \\ 1 \end{pmatrix}, \qquad \text{(Eq. 1)}$$
where $C$ designates a transformation matrix of the pinhole camera model. The position $(i_x, i_y)$ of the imaged point within the image plane can then be calculated as follows:

(9) $$i_x = \frac{x_i}{z_i} = \frac{s\,x_i}{s\,z_i}, \qquad i_y = \frac{y_i}{z_i} = \frac{s\,y_i}{s\,z_i}.$$

(10) This position within the image plane is the same for every $s \neq 0$; that is, every multiplication of the image coordinate vector by a factor $s$ produces the same picture elements. Eq. 1 can be rewritten as shown in Eq. 2 by setting $p = 1/s$ and using the auxiliary coordinates $\tilde{x}_w, \tilde{y}_w, \tilde{z}_w$:

(11) $$s \cdot C_{3\times 3}^{-1}\begin{pmatrix} x_i \\ y_i \\ z_i \end{pmatrix} + C_4^{-1} = \begin{pmatrix} x_w \\ y_w \\ z_w \end{pmatrix} \;\Longleftrightarrow\; \frac{1}{p}\begin{pmatrix} \tilde{x}_w \\ \tilde{y}_w \\ \tilde{z}_w \end{pmatrix} + C_4^{-1} = \begin{pmatrix} x_w \\ y_w \\ z_w \end{pmatrix}, \qquad \text{(Eq. 2)}$$
where $C^{-1}$ designates the inverse of the transformation matrix. The unknown factor $p$ is called the scaling factor.
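The scale ambiguity underlying Eq. 1 and Eq. 2 can be checked numerically. The following short sketch (names assumed, not from the patent) verifies that multiplying the homogeneous image coordinates by any factor $s \neq 0$ leaves the image-plane position $(i_x, i_y)$ unchanged:

```python
# Sketch: the image-plane position is invariant to scaling the
# homogeneous image coordinates by any s != 0.

def image_position(xi, yi, zi):
    """Project homogeneous image coordinates onto the image plane."""
    return (xi / zi, yi / zi)

s = 3.5  # arbitrary nonzero scale factor
p1 = image_position(2.0, 4.0, 8.0)
p2 = image_position(2.0 * s, 4.0 * s, 8.0 * s)  # same pixel as p1
```

Both calls yield the same position (0.25, 0.5), which is why the scaling factor $p = 1/s$ cannot be recovered from a single image alone and must be estimated.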

(12) In a method in accordance with the invention, this scaling factor p of the pinhole camera model is estimated using a Bayes estimator, where the assumption is used as prior knowledge that the base 19 of an object 17 recognized as a road user is normally located in the plane of the road 21.

(13) $p$ is therefore the sought parameter in the estimation using the Bayes estimator. $p = W/W_R$ now applies, where $W$ is a constant object width which is normally used for tracking the object 17 and which is provided by a corresponding tracker, and where $W_R$ is the unknown actual width of the object 17. A finite set of possible object widths $W_R$, and thus of possible scaling factors $p$, is now defined on the basis of the different object-classes predefined by the classifier in order to take the circumstance into account that certain width values are impossible for a specific object 17. For example, a distinction can be made between motorized two-wheeled vehicles, on the one hand, and four-wheeled vehicles, on the other hand, with the four-wheeled vehicles being treated separately from the two-wheeled vehicles.

(14) $$\Omega = \begin{cases} \left\{\dfrac{W}{W_{\min}^{\text{vehicle}}}, \ldots, \dfrac{W}{W_{\max}^{\text{vehicle}}}\right\}, & \text{vehicle type} = \text{four-wheeled vehicle} \\[2ex] \left\{\dfrac{W}{W_{\min}^{\text{bike}}}, \ldots, \dfrac{W}{W_{\max}^{\text{bike}}}\right\}, & \text{vehicle type} = \text{two-wheeled vehicle} \end{cases}$$
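The construction of this finite hypothesis set can be sketched as follows. The concrete width values are assumptions for illustration; the patent leaves them open.

```python
# Sketch: build the finite set of scaling-factor hypotheses
# p_i = W / W_R from a discrete range of plausible real-world
# widths W_R for the recognized object-class.

def scaling_hypotheses(W, widths):
    """Return the hypotheses p_i = W / W_R for each candidate width."""
    return [W / w_r for w_r in widths]

# Assumed discrete width values in meters (not from the patent).
FOUR_WHEELED_WIDTHS = [1.6, 1.8, 2.0, 2.2, 2.5]
TWO_WHEELED_WIDTHS = [0.6, 0.8, 1.0]

# Tracker-provided constant width W = 2.0 (assumed value).
omega = scaling_hypotheses(2.0, FOUR_WHEELED_WIDTHS)
```

Each element of `omega` is one hypothesis $p_i$; the Bayes estimator later selects among exactly these values.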

(15) This produces a finite set of hypotheses for the currently matching scaling factor $p$, characterized by the index $i$. The angle of inclination $\alpha_i^t$ between the base 19 and the plane of the road 21 can be given by Eq. 3 at the time $t$ for each of these hypotheses $p_i$:

(16) $$\alpha_i^t = \arctan\left(\frac{\tilde{x}_w^t / p_i + C_{14}^{-1}}{\tilde{z}_w^t / p_i + C_{34}^{-1}}\right) \qquad \text{(Eq. 3)}$$

(17) In this respect, in accordance with FIG. 2, it is that angle which a view ray 24 adopts with respect to the plane of the road 21, said view ray leading from the origin 23 of the plane of the road 21 to the base 19 of the object 17. The origin 23 of the plane of the road 21 is produced by a perpendicular projection of the vehicle center, not shown, onto the plane of the road 21. It is located behind the origin 25 of the pinhole camera model with respect to the direction of travel.

(18) A direct estimate of the scaling factor $p$ by means of a Bayes estimator with a single observed angle $\alpha$ is not reliable enough to be used in a driver assistance system due to various interference influences such as unpredictable pitching of the device 11. A mean value, such as the arithmetic mean, of a plurality of such angle observations is therefore used to compensate for the interference. The following observation is specifically defined:

(19) $$\bar{\alpha}_i := \operatorname{mean}(A_i) \in \Theta, \qquad \Theta = \left]-\frac{\pi}{2}, \frac{\pi}{2}\right[, \qquad A_i = \{\alpha_i^t \mid t = 1, \ldots, n\},$$
where the angle $\alpha$ is expected to have a normal distribution about the plane of the road 21.
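The time filtering step above can be sketched as follows (function name assumed): the observation fed to the Bayes estimator is the arithmetic mean of the inclination angles collected over $n$ successive images.

```python
# Sketch: the observation for one hypothesis p_i is the arithmetic
# mean of the set A_i = {alpha_i^t | t = 1..n} of inclination angles,
# each in the open interval ]-pi/2, pi/2[ (radians).

def mean_angle_observation(angles):
    """Arithmetic mean of the per-frame inclination angles."""
    return sum(angles) / len(angles)

# Noisy per-frame angles for one hypothesis over 4 frames (assumed values).
obs = mean_angle_observation([0.02, -0.01, 0.00, 0.03])
```

Averaging over several frames damps transient disturbances such as camera pitching, at the cost of a short observation latency.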

(20) $C := \{\text{CAR}, \text{VAN}, \text{TRUCK}\} \cup \{\text{BIKES}\}$ can be defined as the finite set of object-classes, where CAR stands for passenger vehicles, VAN for vans, TRUCK for trucks and BIKES for motorcycles and optionally bicycles. A Bayes estimator is then obtained which combines the above-described angle observation with the object-classes $c_c \in C$ using the ground truth classes $c_{gt} \in C$:

(21) $$P(p_i \mid \bar{\alpha}_i, c_c) = \frac{P(\bar{\alpha}_i, c_c \mid p_i)\,P(p_i)}{\sum_{p_i \in \Omega} P(\bar{\alpha}_i, c_c \mid p_i)\,P(p_i)} \stackrel{\text{def}}{=} \frac{P(\bar{\alpha}_i \mid p_i)\,P(c_c \mid p_i)\,\frac{1}{|\Omega|}}{\frac{1}{|\Omega|}\sum_{p_i \in \Omega} P(\bar{\alpha}_i, c_c \mid p_i)} = \frac{P(\bar{\alpha}_i \mid p_i)\,P(c_c \mid p_i)}{\sum_{p_i \in \Omega} P(\bar{\alpha}_i, c_c \mid p_i)}.$$

(22) Here, $P(p_i \mid \bar{\alpha}_i, c_c)$ designates the distribution of the parameter $p$ to be estimated in dependence on the observed angles $\bar{\alpha}_i$ and vehicle classes $c_c$. The following relationship can be given using the ground truth classes $c_{gt}$:

(23) $$P(c_c \mid p_i) = \sum_{c_{gt}} P(c_c, c_{gt} \mid p_i) = \sum_{c_{gt}} P(c_c \mid c_{gt}, p_i)\,P(c_{gt} \mid p_i) = \sum_{c_{gt}} P(c_c \mid c_{gt})\,P(c_{gt} \mid p_i) = \sum_{c_{gt}} \frac{P(c_c \mid c_{gt})\,P(p_i \mid c_{gt})\,P(c_{gt})}{\sum_{c_{gt}} P(p_i \mid c_{gt})\,P(c_{gt})}.$$

(24) When assembling the formulas for the Bayes estimator, it was implicitly assumed that the different scaling factors $p_i$ are evenly distributed within the finite set of possible values. It is understood that this assumption is arbitrary and that the estimation process can be improved as required by taking additional a priori knowledge into account.

(25) The scaling factor $p$ can now be estimated with the aid of the above-given formulas using a maximum a posteriori process, here by determining the modal value of the a posteriori distribution of the observations (Eq. 4):

(26) $$p = \arg\max_{p_i}\, P(p_i \mid \bar{\alpha}_i, c_c) = \arg\max_{p_i} \frac{P(\bar{\alpha}_i \mid p_i) \sum_{c_{gt}} P(c_c \mid c_{gt})\,P(p_i \mid c_{gt})\,P(c_{gt})}{\left(\sum_{p_i \in \Omega} P(\bar{\alpha}_i, c_c \mid p_i)\right) \sum_{c_{gt}} P(p_i \mid c_{gt})\,P(c_{gt})} = \arg\max_{p_i} \frac{P(\bar{\alpha}_i \mid p_i) \sum_{c_{gt}} P(c_c \mid c_{gt})\,P(p_i \mid c_{gt})\,P(c_{gt})}{\sum_{c_{gt}} P(p_i \mid c_{gt})\,P(c_{gt})} \qquad \text{(Eq. 4)}$$

(27) Provided it is ensured that $P(c_c \mid c_{gt}) \neq 0$, $P(p_i \mid c_{gt}) \neq 0$ and $P(c_{gt}) \neq 0$ for at least one $c_{gt} \in C$, a logarithmic version of formula (Eq. 4) can be used (Eq. 5):

(28) $$p = \arg\max_{p_i}\, P(p_i \mid \bar{\alpha}_i, c_c) = \arg\max_{p_i}\left( \ln P(\bar{\alpha}_i \mid p_i) + \ln \sum_{c_{gt}} P(c_c \mid c_{gt})\,P(p_i \mid c_{gt})\,P(c_{gt}) - \ln \sum_{c_{gt}} P(p_i \mid c_{gt})\,P(c_{gt}) \right), \qquad \text{(Eq. 5)}$$
where $P(\bar{\alpha}_i \mid p_i)$ is calculated in each time step using Eq. 3. The last two summands of the logarithmic formula of Eq. 5 can be calculated before the start of the process and stored in a look-up table since they do not depend on the time.
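A minimal sketch of the MAP selection in Eq. 5 might look as follows. All probability tables here are assumed toy values, not data from the patent; in practice the last two log terms would be precomputed into the look-up table mentioned above.

```python
import math

def map_scaling_factor(p_hyps, lik_alpha, P_cc_given_cgt,
                       P_p_given_cgt, P_cgt, cc):
    """Return the hypothesis p_i maximizing the log score of Eq. 5.

    p_hyps:          list of scaling-factor hypotheses p_i
    lik_alpha[i]:    P(mean angle | p_i), recomputed per time step
    P_cc_given_cgt:  {(cc, cgt): P(cc | cgt)}  (classifier confusion matrix)
    P_p_given_cgt:   {(i, cgt): P(p_i | cgt)}  (class-conditional prior)
    P_cgt:           {cgt: P(cgt)}             (class priors)
    cc:              object-class output by the classifier
    """
    classes = list(P_cgt)
    best_i, best_score = 0, -math.inf
    for i in range(len(p_hyps)):
        num = sum(P_cc_given_cgt[(cc, g)] * P_p_given_cgt[(i, g)] * P_cgt[g]
                  for g in classes)
        den = sum(P_p_given_cgt[(i, g)] * P_cgt[g] for g in classes)
        score = math.log(lik_alpha[i]) + math.log(num) - math.log(den)
        if score > best_score:
            best_i, best_score = i, score
    return p_hyps[best_i]

# Toy example: two hypotheses, a single class "CAR" (assumed values).
p_est = map_scaling_factor(
    p_hyps=[0.5, 1.0],
    lik_alpha=[0.2, 0.8],
    P_cc_given_cgt={("CAR", "CAR"): 1.0},
    P_p_given_cgt={(0, "CAR"): 0.5, (1, "CAR"): 0.5},
    P_cgt={"CAR": 1.0},
    cc="CAR",
)
```

With equal priors on both hypotheses, the angle likelihood decides, so the toy example selects the hypothesis with the larger `lik_alpha`.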

(29) $P(c_c \mid c_{gt})$ is derived from the class confusion matrix of the classifier and $P(c_{gt})$ is the a priori probability for a specific object-class. $P(p_i \mid c_{gt})$ is selected in the following manner, for example:

(30) $$P(p_i \mid c_{gt}) = \begin{cases} \dfrac{1}{\left|\left\{ p_i \;\middle|\; \dfrac{W}{p_i} \in I_{c_{gt}},\; p_i \in \Omega \right\}\right|}, & \dfrac{W}{p_i} \in I_{c_{gt}} \\[2ex] 0, & \text{otherwise}, \end{cases}$$
where $I_{c_{gt}}$ respectively contains the object widths which are possible for the respective vehicle type (CAR, VAN, TRUCK or BIKE). Instead of an even distribution in the corresponding interval, a different distribution could also be used for each ground truth class $c_{gt}$.
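This uniform class-conditional prior can be sketched as follows (the width interval and the constant $W$ are assumed values for illustration):

```python
# Sketch: P(p_i | c_gt) is uniform over the hypotheses whose implied
# real-world width W / p_i lies in the width interval of the ground
# truth class, and zero elsewhere.

def width_prior(p_hyps, W, interval):
    """Return P(p_i | c_gt) for each hypothesis in p_hyps."""
    lo, hi = interval
    admissible = [lo <= W / p <= hi for p in p_hyps]
    n = sum(admissible)  # number of hypotheses inside the interval
    return [1.0 / n if ok else 0.0 for ok in admissible]

# W = 2.0 and a CAR width interval of [1.5 m, 2.1 m] are assumptions.
prior = width_prior([0.8, 1.0, 1.25, 2.0], 2.0, (1.5, 2.1))
```

Here only the hypotheses implying widths 2.0 m and 1.6 m fall in the interval, so they share the probability mass equally.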

(31) After the estimation of the scaling factor $p$, the distance $d$ of the object 17 from the vehicle can be calculated as follows:

(32) $$d = \left\| \frac{1}{p}\begin{pmatrix} \tilde{x}_w \\ \tilde{y}_w \\ \tilde{z}_w \end{pmatrix} + C_4^{-1} \right\|_2.$$
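The final distance computation can be sketched as follows; the auxiliary coordinates and the translation column are placeholders, not values from the patent.

```python
import math

def object_distance(p, aux, c4_inv):
    """Distance d = || (1/p) * aux + c4_inv ||_2 (Euclidean norm).

    aux:    auxiliary coordinates (x~_w, y~_w, z~_w) of the reference point
    c4_inv: translation column C_4^{-1} of the inverted camera model
    """
    world = [a / p + c for a, c in zip(aux, c4_inv)]
    return math.sqrt(sum(w * w for w in world))

# Placeholder inputs: p = 0.5, aux = (3, 0, 2), c4_inv = (0, 0, 1).
d = object_distance(0.5, [3.0, 0.0, 2.0], [0.0, 0.0, 1.0])
```

In this placeholder case the world-coordinate vector is (6, 0, 5), so the distance is the Euclidean norm of that vector.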

(33) In addition, the estimated scaling factor p is used as the input for that recursive state estimator which serves for tracking the movement of the object 17 by means of a tracking process. It can in particular be a Kalman filter in this respect.

(34) A particularly robust estimate of the scaling factor, and consequently a particularly reliable determination of the object distance, is possible due to the fusion of the prior knowledge present in various forms by means of the specific Bayes estimator, as is illustrated in FIG. 3.

(35) While this invention has been described in terms of the preferred embodiments thereof, it is not intended to be so limited, but rather only to the extent set forth in the claims that follow.