Patent classifications
G01S2013/93271
Motion Classification Using Low-Level Detections
Techniques and apparatuses are described that implement motion classification using low-level detections. In particular, a radar system identifies fused detections associated with an object and determines whether those detections indicate that the object is moving. If the object is determined to be moving, or to be moving perpendicular to the host vehicle, a current motion counter or a perpendicular motion counter is incremented, respectively. The current motion flag or the perpendicular motion flag is set to true if the corresponding counter has a value greater than a threshold value. In response to either flag being set to true, the radar system increments a historical motion counter. The host vehicle is then operated based on the current motion flag, the perpendicular motion flag, and the historical motion counter. In this way, the radar system introduces hysteresis to improve the reliability and stability of motion classification.
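The counter-and-flag hysteresis described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation; the class name, threshold value, and per-frame update cadence are assumptions.

```python
class MotionClassifier:
    """Illustrative hysteresis scheme: counters accumulate per-frame motion
    evidence, flags latch once a counter exceeds a threshold, and a
    historical counter records how often either flag has been true."""

    def __init__(self, threshold=3):
        self.threshold = threshold          # assumed value, not from the patent
        self.current_counter = 0
        self.perpendicular_counter = 0
        self.historical_counter = 0
        self.current_flag = False
        self.perpendicular_flag = False

    def update(self, is_moving, is_perpendicular):
        """Process one frame of fused detections for a tracked object."""
        if is_moving:
            self.current_counter += 1
        if is_perpendicular:
            self.perpendicular_counter += 1
        # Flags are set once the counters exceed the threshold.
        self.current_flag = self.current_counter > self.threshold
        self.perpendicular_flag = self.perpendicular_counter > self.threshold
        # Either flag being true increments the historical motion counter,
        # giving downstream logic a longer-term memory of observed motion.
        if self.current_flag or self.perpendicular_flag:
            self.historical_counter += 1
        return self.current_flag, self.perpendicular_flag, self.historical_counter
```

Because the flags depend on accumulated counts rather than a single frame, a one-frame misclassification cannot flip the object's motion state, which is the stability benefit the abstract attributes to hysteresis.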
CORRECTION OF PHASE DEVIATIONS IN THE ANALOG FRONTEND OF RADAR SYSTEMS
According to a further example implementation, the method comprises measuring magnitude response information relating to an analog baseband signal processing chain of a reception channel of a radar system, determining—based on the measured magnitude response information—at least one value which characterizes at least one frequency limit of the baseband signal processing chain, and determining a phase response for the baseband signal processing chain based on the at least one value and a model of the baseband signal processing chain. The method also comprises digitizing an output signal from the baseband signal processing chain and digitally processing the digitized output signal, wherein phase equalizing is carried out based on the determined phase response during normal radar operation of the radar system.
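A minimal sketch of this idea, assuming the baseband chain is modeled as a first-order low-pass filter whose cutoff frequency is the "value characterizing a frequency limit". The fit procedure, function names, and model order are illustrative assumptions, not details from the patent.

```python
import numpy as np

def estimate_cutoff(freqs_hz, measured_mag):
    """Fit the cutoff fc of a first-order low-pass model
    |H(f)| = 1 / sqrt(1 + (f/fc)^2) to measured magnitudes
    by a brute-force least-squares grid search."""
    candidates = np.linspace(freqs_hz[0], freqs_hz[-1] * 4, 2000)
    errs = [np.sum((1.0 / np.sqrt(1 + (freqs_hz / fc) ** 2) - measured_mag) ** 2)
            for fc in candidates]
    return candidates[int(np.argmin(errs))]

def model_phase(freqs_hz, fc):
    """Phase response implied by the first-order low-pass model: -atan(f/fc)."""
    return -np.arctan(freqs_hz / fc)

def equalize(spectrum, freqs_hz, fc):
    """Digitally remove the modeled phase from the digitized output spectrum."""
    return spectrum * np.exp(-1j * model_phase(freqs_hz, fc))
```

The key point the abstract makes is that only the magnitude response needs to be measured; the phase response is then derived from the model and removed digitally during normal operation.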
Traffic radar system with patrol vehicle speed detection
A traffic radar system comprises a first radar transceiver, a second radar transceiver, a speed determining element, and a processing element. The first radar transceiver transmits and receives radar beams and generates a first electronic signal corresponding to the received radar beam. The second radar transceiver transmits and receives radar beams and generates a second electronic signal corresponding to the received radar beam. The speed determining element determines and outputs the speed of the patrol vehicle in which the system is installed. The processing element is configured to receive a plurality of digital data samples derived from the first or second electronic signals, receive the speed of the patrol vehicle, process the digital data samples to determine a relative speed of at least one target vehicle in a front zone or a rear zone of the patrol vehicle, and convert the relative speed of the target vehicle to an absolute speed using the speed of the patrol vehicle.
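The relative-to-absolute conversion at the end of the abstract reduces to simple arithmetic once a sign convention is fixed. The sketch below is one plausible convention (same-direction traffic, closing speed positive), chosen for illustration; the patent does not specify it.

```python
def absolute_speed(relative_speed, patrol_speed, zone):
    """Convert a Doppler-measured relative speed to an absolute speed.

    Assumed convention: for same-direction traffic in the front zone the
    radar measures the target's speed over the patrol vehicle's, so
    absolute = relative + patrol; for the rear zone the sign flips.
    """
    if zone == "front":
        return relative_speed + patrol_speed
    if zone == "rear":
        return patrol_speed - relative_speed
    raise ValueError("zone must be 'front' or 'rear'")
```

For example, a target measured 10 mph faster than a patrol vehicle travelling 55 mph in the front zone is reported at 65 mph absolute.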
System and method for calibrating vehicular radar sensing system
A method for calibrating a vehicular radar sensing system includes disposing two spaced apart calibrating radars at respective transmitting locations that are spaced from a vehicle calibration location at an end of line portion of a vehicle assembly line, and moving a vehicle along the vehicle assembly line, the vehicle including an electronic control unit (ECU) and a vehicular radar operable to sense exterior of the vehicle. Signals are transmitted via the first and second calibrating radars at the transmitting locations and, with the vehicle at the vehicle calibration location, a plurality of radar receivers of the vehicular radar receive the signals transmitted by the first and second calibrating radars, and the vehicular radar generates an output that is processed at the ECU. Responsive to processing at the ECU of the output of the vehicular radar, misalignment of the vehicular radar at the vehicle is determined.
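One simple way such an end-of-line check can determine misalignment is to compare the angles at which the two calibrating radars are expected to appear (known from the fixture geometry) with the angles the vehicular radar actually measures. This is a hypothetical sketch of that comparison, not the patented procedure.

```python
def yaw_misalignment(expected_angles_deg, measured_angles_deg):
    """Estimate the radar's yaw misalignment as the average offset between
    the expected and measured angles of arrival of the two calibrating
    radars. A consistent offset across both transmitters indicates the
    sensor is rotated relative to its nominal mounting orientation."""
    diffs = [m - e for e, m in zip(expected_angles_deg, measured_angles_deg)]
    return sum(diffs) / len(diffs)
```

Using two spaced-apart transmitters rather than one makes a rotation distinguishable from a lateral offset of the sensor, since a pure rotation shifts both measured angles by the same amount.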
Extrinsic calibration of multiple vehicle sensors using combined target detectable by multiple vehicle sensors
Sensors coupled to a vehicle are calibrated, optionally using a dynamic scene with sensor targets around a motorized turntable that rotates the vehicle to different orientations. One vehicle sensor captures a representation of one feature of a sensor target, while another vehicle sensor captures a representation of a different feature of the sensor target, the two features of the sensor target having known relative positioning on the target. The vehicle generates a transformation that maps the captured representations of the two features to positions around the vehicle based on the known relative positioning of the two features on the target.
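The transformation the abstract describes maps features observed by different sensors into a common frame using their known relative positioning on the target. With point features, the standard way to recover such a rigid transform is a least-squares fit (the Kabsch algorithm); the 2D version below is an illustrative sketch, not the patent's method, and real systems would use many correspondences in 3D.

```python
import numpy as np

def rigid_transform_2d(src, dst):
    """Least-squares rotation R and translation t such that
    dst_i ≈ R @ src_i + t, for n×2 arrays of corresponding points
    (Kabsch algorithm)."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # reject reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

Here `src` would be feature positions in one sensor's frame and `dst` the corresponding positions implied by the other sensor plus the target's known feature offsets; the returned `(R, t)` is the extrinsic calibration between the two sensors.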
Interleaving Radar Range and Doppler Processing
Described are techniques for interleaving range and Doppler radar processing. A data cube in memory is accessed differently from one look period to the next, which allows Doppler processing for a current look period to happen in parallel with range processing for the next look period. Range processing for a first look period writes to rows of the data cube; Doppler processing reads from and empties its columns. Before Doppler processing can finish, however, a second look period begins. Rather than re-writing the rows, range processing in the second look period writes to the columns just emptied by the ongoing Doppler processing. Doppler processing for the first look period is allowed to finish by executing during processing idle times in the second look period, e.g., in between chirps. With better processor utilization, Doppler processing is afforded more time for its complex operations while look periods are kept as short as possible.
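The alternating access pattern can be illustrated with a toy square data cube: even look periods write range results into rows and read Doppler slices from columns, while odd periods swap the axes, reusing the columns the previous period's Doppler reads have just emptied. Dimensions and function names are assumptions for illustration.

```python
import numpy as np

N = 8  # chirps per look period == range bins, kept equal for simplicity

def write_range_results(cube, chirp_idx, range_profile, look_period):
    """Store one chirp's range-FFT output along this period's write axis."""
    if look_period % 2 == 0:
        cube[chirp_idx, :] = range_profile   # even periods fill rows
    else:
        cube[:, chirp_idx] = range_profile   # odd periods fill emptied columns

def read_doppler_slice(cube, bin_idx, look_period):
    """Read and empty the Doppler input for one range bin, along the axis
    opposite to this period's writes, freeing it for the next period."""
    if look_period % 2 == 0:
        out = cube[:, bin_idx].copy()
        cube[:, bin_idx] = 0
    else:
        out = cube[bin_idx, :].copy()
        cube[bin_idx, :] = 0
    return out
```

Because each Doppler read frees exactly the storage the next period's range writes will claim, the two stages can overlap in time without a second data cube.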
Sensor array for an autonomously operated utility vehicle and method for surround-view image acquisition
A sensor apparatus for an autonomously operated commercial vehicle to allow panoramic capture of surroundings of the commercial vehicle, including: radar units mountable in front corner areas of the vehicle; downwardly directed cameras having a fisheye lens, mountable on front upper corner areas of the vehicle; at least one rearwardly directed sensor mounted on a section of the vehicle to allow rearward image capture; and an evaluation module to evaluate image data from the radar units, the downwardly directed cameras and the at least one rearwardly directed sensor to achieve the panoramic capture of the surroundings of the vehicle; in which the radar units and the at least one rearwardly directed sensor capture all points in a surrounding area of the vehicle, and wherein the downwardly directed cameras capture all points in the surrounding area of the vehicle. Also described are a related commercial vehicle, method and computer readable medium.
Model for excluding vehicle from sensor field of view
The technology relates to developing a highly accurate understanding of a vehicle's sensor fields of view in relation to the vehicle itself. A training phase is employed to gather sensor data in various situations and scenarios, and a modeling phase takes such information and identifies self-returns and other signals that should either be excluded from analysis during real-time driving or accounted for to avoid false positives. The result is a sensor field of view model for a particular vehicle, which can be extended to other similar makes and models of that vehicle. This approach enables a vehicle to determine whether sensor data is of the vehicle itself or of something else. As a result, the detailed modeling allows the on-board computing system to make driving decisions and take other actions based on accurate sensor information.
METHOD FOR DETECTING AN OBSTACLE ON A ROUTE
A computer-implemented method for detecting an obstacle on a route ahead of a first vehicle. In the method, at least one sensor of the first vehicle records information on a second vehicle driving ahead on the route. Depending on the recorded information, a computer in the first vehicle detects an avoidance maneuver of the second vehicle due to an obstacle, or detects that the second vehicle has driven over an obstacle. An obstacle on the route is then detected depending on the detected avoidance maneuver or the detection that the second vehicle has driven over an obstacle. Depending on the detected obstacle, a measure for protecting the first vehicle and/or the obstacle is initiated.
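The two detection cues in the method, a sudden lateral swerve or a vertical disturbance from driving over something, could be sketched as simple threshold tests on the tracked lead vehicle's motion. The thresholds and signal names below are purely illustrative assumptions.

```python
# Assumed thresholds, not from the patent: a lateral jump suggesting an
# avoidance maneuver, and a vertical bounce suggesting a drive-over.
LATERAL_JUMP_M = 0.8
VERTICAL_BOUNCE_M = 0.05

def detect_obstacle(lateral_offsets_m, vertical_offsets_m):
    """Infer an obstacle from a short history of the lead vehicle's tracked
    lateral and vertical offsets: either a swerve or a bounce triggers."""
    swerve = max(lateral_offsets_m) - min(lateral_offsets_m) > LATERAL_JUMP_M
    bounce = max(vertical_offsets_m) - min(vertical_offsets_m) > VERTICAL_BOUNCE_M
    return swerve or bounce
```

A positive detection would then trigger the protective measure the abstract mentions, e.g. braking or an evasive path for the first vehicle.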