G01S2013/932

Systems and methods for estimating vehicle speed based on radar

Systems, methods, and other embodiments relate to determining the speed of a vehicle. In one embodiment, a method includes receiving a first frame of data generated by a first sensor of a vehicle, the first frame of data including a first set of angular positions associated with a first set of objects in the environment. The method includes receiving a second frame of data generated by a second sensor of the vehicle, the second frame of data including a second set of angular positions associated with a second set of objects in the environment. The method includes generating a speed estimate for the vehicle in relation to the first set of objects and the second set of objects based at least in part on the first set of angular positions of the first frame of data and the second set of angular positions of the second frame of data.
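The frame-to-frame geometry behind such an estimate can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: it assumes each angular position comes with a range measurement, that detections are matched across the two frames, and that the matched objects are stationary.

```python
import math

def estimate_ego_speed(detections_t0, detections_t1, dt):
    """Estimate ego-vehicle speed from matched stationary detections.

    Each detection is (range_m, bearing_rad) in the vehicle frame.
    detections_t0[i] and detections_t1[i] are assumed to be the same
    stationary object observed in two frames dt seconds apart.
    """
    speeds = []
    for (r0, a0), (r1, a1) in zip(detections_t0, detections_t1):
        # Object position in the vehicle frame at each time step.
        x0, y0 = r0 * math.cos(a0), r0 * math.sin(a0)
        x1, y1 = r1 * math.cos(a1), r1 * math.sin(a1)
        # For a stationary object, its apparent displacement mirrors
        # the vehicle's own motion between the two frames.
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    # Average over all matched objects to suppress per-detection noise.
    return sum(speeds) / len(speeds)
```

Averaging over objects seen by both sensors is one plausible way to combine the two frames; the patent itself does not fix a particular fusion rule.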

Secure vehicle communications architecture for improved blind spot and driving distance detection

Disclosed are techniques for improving an advanced driver-assistance system (ADAS) using a secure channel area. In one embodiment, a method is disclosed comprising establishing a secure channel area extending from at least one side of a first vehicle; detecting a presence of a second vehicle in the secure channel area; establishing a secure connection with the second vehicle upon detecting the presence; exchanging messages between the first vehicle and the second vehicle, the messages including a position and speed of a sending vehicle; taking control of a position and speed of the first vehicle based on the contents of the messages; and releasing control of the position and speed of the first vehicle upon detecting that the secure connection was released.
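The detect/connect/control/release lifecycle described above can be sketched as a small state machine. Class and method names are hypothetical, and the "match the peer's speed" policy is a placeholder assumption; the abstract does not specify what control action is taken.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    position: float   # longitudinal position in metres (illustrative)
    speed: float      # metres per second

class SecureChannelADAS:
    """Hypothetical sketch of the secure-channel control hand-off."""

    def __init__(self):
        self.connected = False
        self.controlling = False

    def on_vehicle_detected(self):
        # A second vehicle entered the secure channel area:
        # establish the secure connection.
        self.connected = True

    def on_message(self, own: VehicleState, peer: VehicleState):
        # Messages carry the sender's position and speed.
        if not self.connected:
            return own
        # Take control; as a placeholder policy, match the peer's speed.
        self.controlling = True
        return VehicleState(own.position, peer.speed)

    def on_connection_released(self, own: VehicleState):
        # Release control back to the driver when the connection drops.
        self.connected = False
        self.controlling = False
        return own
```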

APPARATUS AND METHOD FOR MONITORING SURROUNDING ENVIRONMENT OF VEHICLE
20230027766 · 2023-01-26

An apparatus for monitoring the surrounding environment of a vehicle includes: a plurality of detection sensors to detect objects outside the vehicle frame by frame at a predefined period; and a controller to extract a stationary object from among the objects detected by the detection sensors, map the stationary object to a grid map, calculate an occupancy probability parameter indicative of the probability that the stationary object is located on a grid of the grid map, and monitor the surrounding environment of the vehicle based on the occupancy probability parameter. The controller maps the stationary object to the grid map while updating the grid map by changing the index of each grid constituting the grid map according to behavior information of the vehicle.
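One way to picture the index update is a grid whose cells stay attached to the world while the vehicle drives through it. The sketch below is an assumed simplification: motion is reduced to a whole-cell forward shift, and the occupancy update is a simple convex blend toward a hit probability, neither of which the abstract specifies.

```python
import numpy as np

def update_grid(grid, stationary_points, ego_dx_cells, hit_p=0.7):
    """Shift an occupancy grid by the vehicle's motion, then map new
    stationary detections into it.

    grid             : 2-D array of occupancy probabilities
    stationary_points: list of (row, col) cells occupied this frame
    ego_dx_cells     : whole cells the vehicle advanced since the
                       previous frame (row 0 is nearest the vehicle)
    """
    # Re-index the grid so cells stay attached to the world, not the
    # vehicle: shift opposite to ego motion and pad the newly exposed
    # area with the 0.5 "unknown" prior.
    shifted = np.full_like(grid, 0.5)
    if ego_dx_cells > 0:
        shifted[:-ego_dx_cells, :] = grid[ego_dx_cells:, :]
    else:
        shifted = grid.copy()
    # Simple probabilistic update toward occupancy for hit cells.
    for r, c in stationary_points:
        if 0 <= r < shifted.shape[0] and 0 <= c < shifted.shape[1]:
            shifted[r, c] = shifted[r, c] * (1 - hit_p) + hit_p
    return shifted
```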

SYSTEM AND METHOD FOR CLASSIFYING A TYPE OF INTERACTION BETWEEN A HUMAN USER AND A MOBILE COMMUNICATION DEVICE IN A VOLUME BASED ON SENSOR FUSION

A system and method for classifying a type of interaction between a human user and a mobile communication device within a defined volume, based on multiple sensors. The method may include: determining a position of the mobile communication device relative to a frame of reference of the defined volume, based on the angle of arrival, time of flight, or received intensity of radio frequency (RF) signals transmitted by the mobile communication device and received by a phone location unit that is located within the defined volume and configured to wirelessly communicate with the mobile communication device; obtaining at least one sensor measurement related to the mobile communication device from various non-RF sensors; repeating the obtaining to yield a time series of sensor readings; and using a computer processor to classify the type of interaction into one of a plurality of predefined types of interactions, based on the position and the time series of sensor readings.

Device and method for determining the initial direction of movement of an object in the detection range of a motor vehicle radar sensor

An estimated initial direction of movement of a newly detected object is to be determined. The actual previous directions of movement and positions of previously detected objects are determined and stored. When a new object is detected at a certain position, the stored direction of movement of one of the previously detected objects at that position is used as a basis for estimating the initial direction of movement of the newly detected object at that position.
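A minimal sketch of the lookup is a nearest-neighbour search over stored (position, heading) observations. The `max_dist` gating threshold and the "take the single nearest prior track" rule are illustrative assumptions, not details from the patent.

```python
import math

def initial_direction(new_pos, history, max_dist=5.0):
    """Estimate the initial heading of a newly detected object.

    history : list of ((x, y), heading_rad) pairs recorded from
              previously tracked objects at the positions where they
              were observed.
    Returns the heading of the nearest stored observation within
    max_dist metres of the new position, or None if no prior object
    passed close enough.
    """
    best = None
    best_d = max_dist
    for (px, py), heading in history:
        d = math.hypot(px - new_pos[0], py - new_pos[1])
        if d <= best_d:
            best, best_d = heading, d
    return best
```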

Vehicle radar sensing system with enhanced angle resolution

A vehicular sensing system includes at least one radar sensor disposed at a vehicle and having a field of sensing exterior of the vehicle. The radar sensor includes multiple transmitting antennas and multiple receiving antennas. The transmitting antennas transmit signals and the receiving antennas receive the signals reflected off objects. Multiple scans of radar data are received at an electronic control unit (ECU) and processed at a processor of the ECU. The ECU detects the presence of a plurality of objects exterior of the equipped vehicle and within the field of sensing of the at least one radar sensor. The ECU, responsive at least in part to processing at the processor of the received multiple scans of captured radar data and a received vehicle motion estimation, tracks the detected objects over two or more scans.

Technologies for acting based on object tracking
11703593 · 2023-07-18

This disclosure enables various technologies involving various actions based on tracking an object via a plurality of distance sensors, without synchronizing the carrier waves of the distance sensors and without employing a phase-locked loop (PLL) technique on the distance sensors.

Multi-modal sensor data association architecture

A machine-learning architecture may be trained to determine point cloud data associated with different types of sensors with an object detected in an image and/or generate a three-dimensional region of interest (ROI) associated with the object. In some examples, the point cloud data may be associated with sensors such as, for example, a lidar device, radar device, etc.

Traffic radar system with patrol vehicle speed detection
11703602 · 2023-07-18

A traffic radar system comprises a first radar transceiver, a second radar transceiver, a speed determining element, and a processing element. The first radar transceiver transmits and receives radar beams and generates a first electronic signal corresponding to the received radar beams. The second radar transceiver transmits and receives radar beams and generates a second electronic signal corresponding to the received radar beams. The speed determining element determines and outputs a speed of the patrol vehicle on which the system is mounted. The processing element is configured to receive a plurality of digital data samples derived from the first or second electronic signals, receive the speed of the patrol vehicle, process the digital data samples to determine a relative speed of at least one target vehicle in a front zone or a rear zone, and convert the relative speed of the target vehicle to an absolute speed using the speed of the patrol vehicle.
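The relative-to-absolute conversion is straightforward arithmetic once a sign convention is fixed. The convention below (positive relative speed means the gap is shrinking) and the same-direction/oncoming distinction are illustrative assumptions; the abstract does not state them.

```python
def absolute_target_speed(relative_speed, patrol_speed, same_direction):
    """Convert a radar-measured closing speed to an absolute speed.

    relative_speed : closing speed measured by the radar (positive when
                     the range to the target is shrinking)
    patrol_speed   : patrol vehicle's own speed, from the speed
                     determining element
    same_direction : True if the target travels the same way as the
                     patrol vehicle
    """
    if same_direction:
        # Closing on a slower same-direction target:
        # closing = patrol - target  =>  target = patrol - closing
        return patrol_speed - relative_speed
    # Oncoming target: closing = patrol + target
    # => target = closing - patrol
    return relative_speed - patrol_speed
```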

SENSOR RECOGNITION INTEGRATION DEVICE
20230221432 · 2023-07-13

Provided is a sensor recognition integration device that reduces the load of integration processing while satisfying the minimum accuracy required for vehicle travel control, thereby improving the processing performance of an ECU and suppressing cost increases. A sensor recognition integration device B006, which integrates a plurality of pieces of object information on an object around an own vehicle detected by a plurality of external recognition sensors, includes: a prediction update unit 100 that generates predicted object information by predicting the behavior of the object; an association unit 101 that calculates the relationship between the predicted object information and the plurality of pieces of object information; an integration processing mode determination unit 102 that switches an integration processing mode, which determines how the plurality of pieces of object information are integrated, on the basis of the positional relationship between the predicted object information and a specific region (for example, a boundary portion) of the overlapping region of the detection regions of the plurality of external recognition sensors; and an integration target information generation unit 104 that integrates the plurality of pieces of object information associated with the predicted object information on the basis of the integration processing mode.
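The mode-switching idea can be sketched with a one-dimensional stand-in for the overlap region: spend full fusion effort only where a predicted object sits near the boundary of the sensors' overlapping fields of view. The 1-D geometry, the margin value, and the mode names are illustrative assumptions.

```python
def choose_integration_mode(predicted_pos, overlap_region,
                            boundary_margin=1.0):
    """Pick an integration processing mode from a predicted object's
    position relative to the sensors' overlapping detection region.

    overlap_region : (x_min, x_max) span where two sensors' fields of
                     view overlap (reduced to 1-D for illustration)
    Returns "full" near an overlap boundary, where carefully fusing
    both sensors' reports matters most, and "lightweight" elsewhere,
    trading accuracy headroom for lower ECU load.
    """
    x_min, x_max = overlap_region
    near_boundary = (abs(predicted_pos - x_min) <= boundary_margin or
                     abs(predicted_pos - x_max) <= boundary_margin)
    return "full" if near_boundary else "lightweight"
```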