G01S7/2955

RADAR SYSTEMS FOR DETERMINING VEHICLE SPEED OVER GROUND

A radar module determines a two-dimensional velocity vector of a heavy-duty vehicle with respect to a ground plane supporting the vehicle. The system has a radar transceiver arranged to transmit and to receive a radar signal, via an antenna array, wherein the antenna array is configured to emit the radar signal in a first direction and in a second direction different from the first direction. The radar module has a processing device to detect first and second radar signal components of the received radar signal based on their respective angle of arrival, AoA, where the first radar signal component has an AoA corresponding to the first direction and the second radar signal component has an AoA corresponding to the second direction. The processing device determines the two-dimensional velocity vector of the heavy-duty vehicle based on respective Doppler frequencies of the first and second radar signal components.
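Given the two beam directions and the two measured Doppler frequencies, the velocity computation described above reduces to a 2×2 linear solve. A minimal sketch in Python, where the function name, angle convention, and wavelength parameter are illustrative assumptions rather than the patent's terms:

```python
import numpy as np

def velocity_over_ground(theta_1, theta_2, f_d1, f_d2, wavelength):
    """Recover the 2D velocity vector from two Doppler measurements.

    Each Doppler frequency is the projection of the velocity onto the
    corresponding beam direction: f_d = (2 / wavelength) * v . u(theta).
    Two distinct directions give two projections, which determine v.
    """
    A = np.array([[np.cos(theta_1), np.sin(theta_1)],
                  [np.cos(theta_2), np.sin(theta_2)]])
    f = np.array([f_d1, f_d2])
    return (wavelength / 2.0) * np.linalg.solve(A, f)
```

For example, with the vehicle moving at 10 m/s along x, a 77 GHz radar (wavelength about 3.9 mm), and beams at ±30°, both beams observe f_d = (2/λ)·10·cos 30°, and the solve recovers the vector (10, 0).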

Path providing device and path providing method thereof
11679781 · 2023-06-20

A path providing device for providing a route to a vehicle includes a first communication module configured to receive high-definition (HD) map information from an external server, a second communication module configured to receive external information generated by an external device located within a predetermined range from the vehicle, and a processor configured to generate forward path information for guiding the vehicle based on the HD map information and provide the forward path information to at least one of the electric components provided in the vehicle. The processor is configured to generate dynamic information related to an object to be sensed by the at least one electric component based on the external information and to match the dynamic information to the forward path information.

Method of Determining an Uncertainty Estimate of an Estimated Velocity
20220373572 · 2022-11-24

A method of determining an uncertainty estimate of an estimated velocity of an object includes determining the uncertainty with respect to a first estimated coefficient and a second estimated coefficient of the velocity profile equation of the object. The first estimated coefficient is assigned to a first spatial dimension of the estimated velocity, and the second estimated coefficient is assigned to a second spatial dimension of the estimated velocity. The velocity profile equation represents the estimated velocity as a function of the first and second estimated coefficients. The method also includes determining the uncertainty with respect to an angular velocity of the object, a first coordinate of the object in the second spatial dimension, and a second coordinate of the object in the first spatial dimension.
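A velocity profile equation of the kind referenced above is commonly the radial-speed model v_r(θ) = c1·cos θ + c2·sin θ over detection azimuths θ. A hedged sketch of fitting the two coefficients and propagating measurement noise into their covariance (the specific model form, function name, and `noise_var` parameter are assumptions for illustration, not the patent's method):

```python
import numpy as np

def fit_velocity_profile(azimuths, radial_speeds, noise_var):
    """Least-squares fit of v_r = c1*cos(theta) + c2*sin(theta).

    Returns the estimated coefficients and their covariance matrix,
    assuming i.i.d. radial-speed noise with variance noise_var.
    """
    A = np.column_stack([np.cos(azimuths), np.sin(azimuths)])
    coeffs, *_ = np.linalg.lstsq(A, radial_speeds, rcond=None)
    cov = noise_var * np.linalg.inv(A.T @ A)  # uncertainty of (c1, c2)
    return coeffs, cov
```

The covariance here is the standard linear least-squares result; the diagonal entries are the per-dimension coefficient variances and the off-diagonal entry captures their correlation.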

SYSTEM AND TECHNIQUES FOR CLIPPING SONAR IMAGE DATA
20220057499 · 2022-02-24

Technologies for processing imaging data, such as sonar images, are disclosed. A computing device obtains a three-dimensional (3D) volumetric view of a space, such as an underwater space. The view comprises multiple voxels, each holding a value characterizing a 3D point in the space. The computing device divides this data into slices, each representing a cross-section of the 3D volumetric view, and clips one or more voxels in these slices based on a weighting function.
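The slice-and-clip step might look like the following sketch, in which the volume is cut into z-plane cross-sections and voxels whose weighted value falls below a threshold are zeroed. The weighting function, threshold, and slicing axis are illustrative assumptions:

```python
import numpy as np

def clip_volume(volume, weight_fn, threshold):
    """Clip voxels slice by slice.

    volume    : 3D array of voxel values, indexed (x, y, z)
    weight_fn : maps a 2D slice of voxel values to per-voxel weights
    threshold : voxels whose weight falls below this are zeroed
    """
    clipped = volume.copy()
    for z in range(volume.shape[2]):
        slc = clipped[:, :, z]               # one cross-section of the view
        slc[weight_fn(slc) < threshold] = 0.0
    return clipped
```

Passing an identity weighting function makes this a plain amplitude clip; a more elaborate weighting could, for instance, discount voxels near the edges of each slice.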

Covariance Matrix Technique for Error Reduction

A method for filtering spatial error from a measurement vector is provided to correct for roll, pitch and yaw angular motion. The method includes the following operations:

1. Establish an unstabilized body reference frame.
2. Convert the measurement vector to an unstabilized state vector x_U in the unstabilized body reference frame.
3. Establish a stabilized East-North-Up (ENU) reference frame.
4. Calculate an unstabilized pre-transform covariance matrix M_U from the position variance of the body reference frame.
5. Measure roll, pitch and yaw in the body reference frame as respective angle values (r, p, w).
6. Calculate a transform matrix T between the body reference frame and the ENU reference frame.
7. Calculate a stabilized data vector x_S = T x_U from the transform matrix and the unstabilized state vector.
8. Calculate a measured angle error sensitivity matrix M_A from the angle values.
9. Calculate a tri-diagonal angle error component matrix M_E from the squared values of the angle variance of the body reference frame.
10. Calculate a total error covariance matrix P_S = M_A M_E M_A^T + T M_U T^T.
11. Calculate a Kalman gain matrix for current time k+1 as K(k+1) = P(k+1|k) H^T [H P(k+1|k) H^T + P_S]^(-1), where P(k+1|k) is the predicted state covariance matrix from previous time k to the current time k+1, and H is the measurement Jacobian.
12. Apply the Kalman gain matrix to a predicted state estimate to correct the measurement vector x_m.
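The covariance composition and Kalman gain from the operations above can be sketched directly in NumPy. Matrix names follow the abstract; the dimensions and identity-matrix values used in any test are illustrative assumptions:

```python
import numpy as np

def total_error_covariance(M_A, M_E, T, M_U):
    """P_S = M_A M_E M_A^T + T M_U T^T.

    Combines the angle-error contribution (first term) with the
    transformed position-error contribution (second term).
    """
    return M_A @ M_E @ M_A.T + T @ M_U @ T.T

def kalman_gain(P_pred, H, P_S):
    """K(k+1) = P(k+1|k) H^T [H P(k+1|k) H^T + P_S]^-1."""
    S = H @ P_pred @ H.T + P_S          # innovation covariance
    return P_pred @ H.T @ np.linalg.inv(S)
```

As a sanity check, with all inputs set to 3×3 identity matrices, P_S evaluates to 2I and the gain to I/3, matching the formulas term by term.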

Radar imaging system and related techniques
09746554 · 2017-08-29

A radar imaging system and technique are described in which the imaging system generates an image and transforms the image into world coordinates, taking into account host position and heading. Once in world coordinates, successive radar images can be summed (integrated, averaged) to produce an integrated image having a resolution which is improved compared with an originally generated image.
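After registration into a common world frame, the integration step is a per-pixel average: stationary scene content is preserved while uncorrelated noise is suppressed, its standard deviation falling roughly as the square root of the number of scans. A minimal sketch, assuming the world-coordinate transform has already been applied upstream:

```python
import numpy as np

def integrate_images(world_images):
    """Average a sequence of radar images already in world coordinates.

    world_images : iterable of equally-shaped 2D arrays, one per scan
    Returns the pixel-wise mean (the integrated image).
    """
    return np.mean(np.stack(world_images), axis=0)
```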

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, PROGRAM, MOBILE-OBJECT CONTROL APPARATUS, AND MOBILE OBJECT

The present technology relates to an information processing apparatus, an information processing method, a program, a mobile-object control apparatus, and a mobile object that make it possible to improve the accuracy in recognizing a target object.

An information processing apparatus includes a geometric transformation section that transforms at least one of a captured image or a sensor image to match coordinate systems of the captured image and the sensor image, the captured image being obtained by an image sensor, the sensor image indicating a sensing result of a sensor of which a sensing range at least partially overlaps a sensing range of the image sensor; and an object recognition section that performs processing of recognizing a target object on the basis of the captured image and sensor image of which the coordinate systems have been matched to each other. The present technology is applicable to, for example, a system used to recognize a target object around a vehicle.

PROCESS FOR MONITORING VEHICLES BY A PLURALITY OF SENSORS

The invention relates to a process for monitoring vehicles on a road with a system comprising at least one radar sensor and a second sensor, different from the radar sensor, which is a time-of-flight optical sensor or an optical image sensor. The process comprises a temporal readjustment and a spatial matching in order to obtain a set of measurement points, each assigned first characteristics derived from the radar data and second characteristics derived from the optical data; the determination of radar vehicle trackings and optical vehicle trackings; a comparison of similarity between the radar vehicle trackings and the optical vehicle trackings; and the elimination of radar vehicle trackings for which no optical vehicle tracking is similar. The process further comprises monitoring a parameter derived from the first characteristics of a retained radar vehicle tracking.

Localization using particle filtering and image registration of radar against elevation datasets

A system for localization includes a radar, a database, a simulator, a registrar, and a filter. The radar is positioned at a disposed location requiring localization and generates a radar image scanning a proximity around the disposed location. The database stores features of a landmass. The simulator generates synthesized images of the features that the radar is predicted to generate from corresponding viewpoints. The registrar calculates respective correlation indicators between the radar image and each synthesized image. The filter sets the pose estimate of the disposed location to the average of the viewpoints corresponding to the synthesized images with the best correlation indicators.
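The filter's pose update can be sketched as an average over the best-correlating viewpoints, much like a resampling step in a particle filter. The top-k selection rule and all names below are illustrative assumptions:

```python
import numpy as np

def pose_estimate(viewpoints, correlations, top_k=10):
    """Average the viewpoints whose synthesized images correlate best
    with the live radar image.

    viewpoints   : (N, d) array of candidate poses (e.g. x, y, heading)
    correlations : (N,) correlation indicator per synthesized image
    """
    best = np.argsort(correlations)[-top_k:]   # indices of the top-k scores
    return viewpoints[best].mean(axis=0)
```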

AXIAL DEVIATION ESTIMATING DEVICE
20220229168 · 2022-07-21

An axial misalignment estimation apparatus, mounted in a moving body, acquires reflection point information for each of the reflection points detected by a radar apparatus and, based on that information, extracts from the reflection points at least one road-surface reflection point detected by reflection on a road surface. For each road-surface reflection point, the apparatus identifies apparatus-system coordinates based on the coordinate axes of the radar apparatus, and estimates an axial misalignment angle and a height of the radar apparatus using a relational expression established between at least two unknown parameters and at least two elements of the apparatus-system coordinates of the road-surface reflection point. The unknown parameters include the axial misalignment angle, being the misalignment angle of a coordinate axis of the radar apparatus around a target axis, and the mounting height of the radar apparatus.
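For small misalignment angles, a relational expression of the kind described can be linearized: rotating a sensor-frame road point (x, z) by angle α about the pitch axis must place it on the ground plane at height -h, giving x·α + z ≈ -h, one equation per point. A hedged least-squares sketch under that small-angle assumption (the model form and names are illustrative, not the patent's exact expression):

```python
import numpy as np

def estimate_axial_misalignment(points_xz):
    """Estimate (alpha, h) from road-surface reflection points.

    Each point with sensor-frame coordinates (x, z) satisfies
    x*sin(alpha) + z*cos(alpha) = -h; for small alpha this reduces to
    the linear equation x*alpha + z = -h, stacked one row per point.
    """
    x, z = points_xz[:, 0], points_xz[:, 1]
    A = np.column_stack([x, np.ones_like(x)])
    sol, *_ = np.linalg.lstsq(A, -z, rcond=None)
    alpha, h = sol
    return alpha, h
```

Two road-surface points at distinct ranges already make the system solvable; additional points over-determine it and average out per-detection noise.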