
SENSOR ASSEMBLY WITH LIDAR FOR AUTONOMOUS VEHICLES

A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.

Methods and systems for tracking a mover's lane over time

Systems and methods for monitoring the lane of an object in an environment of an autonomous vehicle are disclosed. The methods include receiving sensor data corresponding to the object, and assigning an instantaneous probability to each of a plurality of lanes based on the sensor data as a measure of likelihood that the object is in that lane at a current time. The methods also include generating a transition matrix for each of the plurality of lanes that encodes one or more probabilities that the object transitioned to that lane from another lane in the environment, or from that lane to another lane in the environment, at the current time. The methods then include determining an assigned probability associated with each of the plurality of lanes, based on the instantaneous probability and the transition matrix, as a measure of likelihood of the object occupying that lane at the current time.
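As a sketch, the per-timestep update described above resembles the forward step of a hidden Markov model: propagate the previous lane distribution through the transition matrix, weight by the instantaneous sensor-derived likelihood, and renormalize. All function names, lane counts, and numbers below are illustrative, not taken from the patent.

```python
import numpy as np

def update_lane_probabilities(prev_probs, transition, instantaneous):
    """One Bayesian update step: propagate the previous lane probabilities
    through the transition matrix, weight each lane by its instantaneous
    (sensor-derived) likelihood, and renormalize to a distribution."""
    predicted = transition.T @ prev_probs   # P(lane at t) from P(lane at t-1)
    posterior = predicted * instantaneous   # weight by current evidence
    return posterior / posterior.sum()      # normalize

# Three lanes; the object was almost certainly in lane 0 at the last tick.
prev = np.array([0.9, 0.08, 0.02])
# Row i, column j: probability of transitioning from lane i to lane j.
T = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
# Sensor data now suggests the object is drifting toward lane 1.
inst = np.array([0.3, 0.6, 0.1])
probs = update_lane_probabilities(prev, T, inst)
print(probs)  # lane 0 remains most likely, but lane 1's mass has grown
```

The design choice of combining a transition model with an instantaneous measurement keeps the assignment stable under noisy per-frame detections while still responding to genuine lane changes.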

SYSTEMS AND METHODS FOR IMPROVING ACCURACY OF PASSENGER PICK-UP LOCATION FOR AUTONOMOUS VEHICLES
20230044015 · 2023-02-09

Systems and methods for determining precise pick-up locations for passengers who have requested autonomous vehicle rides. In particular, systems and methods are provided for using wireless signals to determine user location. In some examples, wireless ranging technology, such as Ultra Wide Band (UWB), is used to determine the user location. Wireless transceivers are used to determine a mobile device's range, and range information from multiple transceivers is used to determine the mobile device's position. In some examples, triangulation is used to determine user location, such as triangulation between one or more wireless transceivers and the mobile device. In various examples, wireless transceivers are installed on autonomous vehicles, and in some examples, wireless transceivers are installed in various static locations (e.g., on buildings, lamp posts, or other structures).
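A minimal sketch of position-from-ranges as described above: with ranges to three transceivers at known positions, the circle equations can be linearized against the first anchor and solved in a least-squares sense. The anchor positions and helper name are hypothetical, not from the patent.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Estimate a 2D position from ranges to known anchor positions by
    linearizing the circle equations against the first anchor and
    solving the resulting linear system in a least-squares sense."""
    (x0, y0), r0 = anchors[0], ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    est, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return est

# Hypothetical transceivers: one on a vehicle, two on lamp posts (meters).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
ranges = [np.hypot(*(true_pos - np.array(a))) for a in anchors]
est = trilaterate(anchors, ranges)
print(est)  # recovers approximately [3. 4.]
```

With more than three transceivers the same least-squares formulation averages out per-range noise, which is why range information "from multiple transceivers" improves the position fix.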

SUPER RESOLUTION SYSTEM, DEVICE AND METHODS
20230039572 · 2023-02-09

A super resolution system, the system including: at least one antenna; transmission electronics; receiving electronics; and receiving computing electronics, where the transmission electronics are structured to transmit a first electromagnetic wave having an Orbital Angular Momentum wave-front through the antenna towards a target, where the transmission electronics are structured to transmit a second electromagnetic wave having a non-Orbital-Angular-Momentum wave-front through a first portion of the antenna towards the target, where the receiving electronics are structured to form a first signal from a first return wave of the first electromagnetic wave, where the receiving electronics are structured to form a second signal from a second return wave of the second electromagnetic wave, and where the receiving computing electronics are structured to compute target information by using at least one difference between the first signal and the second signal.

System and method for ordered representation and feature extraction for point clouds obtained by detection and ranging sensor

A method is described which includes receiving a point cloud having a plurality of data points each representing a 3D location in a 3D space, the point cloud being obtained using a detection and ranging (DAR) sensor. For each data point, the method includes associating the data point with a 3D volume containing the 3D location of the data point, the 3D volume being defined using a 3D lattice that partitions the 3D space based on spherical coordinates. For at least one 3D volume, the data points within the 3D volume are sorted based on at least one dimension of the 3D lattice, and the sorted data points are stored as a set of ordered data points. The method also includes performing feature extraction on the set of ordered data points to generate a set of ordered feature vectors and providing the set of ordered feature vectors to perform a machine learning inference task.
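The binning-and-ordering step described above can be sketched as follows: convert each point to spherical coordinates, assign it a cell index on a (range, azimuth, elevation) lattice, and stably sort by that index so points in the same cell are contiguous. The lattice resolutions and function name are illustrative assumptions, not the patent's parameters.

```python
import numpy as np

def spherical_bin_and_sort(points, n_r=32, n_az=64, n_el=16, r_max=100.0):
    """Assign each 3D point to a cell of a spherical lattice
    (range, azimuth, elevation), then sort points by flattened cell
    index so points sharing a cell become contiguous and ordered."""
    x, y, z = points.T
    r = np.sqrt(x**2 + y**2 + z**2)
    az = np.arctan2(y, x)                                   # [-pi, pi)
    el = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1, 1)) # [-pi/2, pi/2]
    ir = np.clip((r / r_max * n_r).astype(int), 0, n_r - 1)
    ia = np.clip(((az + np.pi) / (2 * np.pi) * n_az).astype(int), 0, n_az - 1)
    ie = np.clip(((el + np.pi / 2) / np.pi * n_el).astype(int), 0, n_el - 1)
    cell = (ir * n_az + ia) * n_el + ie      # flattened lattice index
    order = np.argsort(cell, kind="stable")  # stable: keeps intra-cell order
    return points[order], cell[order]

pts = np.random.default_rng(0).uniform(-50.0, 50.0, size=(1000, 3))
ordered, cells = spherical_bin_and_sort(pts)
```

A spherical lattice matches the native geometry of a scanning DAR sensor (fixed angular steps, range along the beam), so cells correspond roughly to individual beam returns, which is what makes the ordered layout useful for downstream feature extraction.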

ASSOCIATION OF CAMERA IMAGES AND RADAR DATA IN AUTONOMOUS VEHICLE APPLICATIONS
20230038842 · 2023-02-09

The described aspects and implementations enable fast and accurate object identification in autonomous vehicle (AV) applications by combining radar data with camera images. In one implementation, disclosed is a method and a system to perform the method that includes obtaining a radar image of a first hypothetical object in an environment of the AV, obtaining a camera image of a second hypothetical object in the environment of the AV, and processing the radar image and the camera image using one or more machine-learning models (MLMs) to obtain a prediction measure representing a likelihood that the first hypothetical object and the second hypothetical object correspond to a same object in the environment of the AV.

Radar device for vehicle, controlling method of radar device and radar system for vehicle
11709261 · 2023-07-25

The present disclosure relates to a vehicle radar device, a controlling method thereof, and a radar system. A radar device according to an embodiment includes a transceiver controlled to transmit a transmission signal in an operating frequency band according to a selection mode among a plurality of frequency band modes and to receive a reception signal through a receiving antenna, and a mode selector that dynamically determines one of the plurality of frequency band modes as the selection mode based on at least one of a target distance to a target and a maximum detection distance for each frequency band. According to embodiments of the present disclosure, the distance resolution of the radar can be optimized by dynamically varying the frequency bandwidth linked with the maximum detection distance according to the target distance under specific driving conditions.
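The trade-off being exploited: wider bandwidth B gives finer range resolution (roughly c / 2B for an FMCW radar) but typically a shorter maximum detection distance. A mode selector along the lines described might pick the widest-bandwidth mode that still covers the target. The mode table and names below are illustrative assumptions, not values from the patent.

```python
C = 3e8  # speed of light, m/s

# Hypothetical frequency-band modes: wider bandwidth -> finer range
# resolution, but a shorter maximum detection distance.
MODES = [
    {"name": "long",  "bandwidth_hz": 0.5e9, "max_range_m": 250.0},
    {"name": "mid",   "bandwidth_hz": 1.0e9, "max_range_m": 120.0},
    {"name": "short", "bandwidth_hz": 4.0e9, "max_range_m": 40.0},
]

def select_mode(target_distance_m):
    """Pick the widest-bandwidth (finest-resolution) mode whose maximum
    detection distance still covers the current target distance."""
    usable = [m for m in MODES if m["max_range_m"] >= target_distance_m]
    # Fall back to the longest-range mode if the target is beyond all modes.
    return max(usable, key=lambda m: m["bandwidth_hz"]) if usable else MODES[0]

mode = select_mode(30.0)
resolution_m = C / (2 * mode["bandwidth_hz"])
print(mode["name"], resolution_m)  # at 30 m the 4 GHz mode applies: 0.0375 m
```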

Autonomy first route optimization for autonomous vehicles

Embodiments herein can determine an optimal route for an autonomous electric vehicle. The system may score viable routes between the start and end locations of a trip using a numeric or other scale that denotes how viable the route is for autonomy. The score is adjusted using a variety of factors, where a learning process leverages both offline and online data. Route scoring is not based simply on the shortest distance between the start and end points; rather, the best route is determined based on the driving context for the vehicle and the user.
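The "autonomy first" scoring idea can be sketched as a weighted sum of autonomy-friendliness factors, so a longer but better-mapped route can outrank the shortest one. The factor names and weights below are invented for illustration; the patent's actual factors and learning process are not specified here.

```python
def autonomy_score(route, weights):
    """Score a candidate route by weighted autonomy-friendliness factors
    (each factor normalized to [0, 1]); higher is better."""
    return sum(weights[k] * route.get(k, 0.0) for k in weights)

# Hypothetical factors: fraction of route with HD-map coverage,
# fraction of turns that are protected, and overall scene simplicity.
weights = {"mapped_coverage": 0.5, "protected_turns": 0.3, "low_complexity": 0.2}
routes = [
    {"name": "shortest", "mapped_coverage": 0.60,
     "protected_turns": 0.40, "low_complexity": 0.50},
    {"name": "highway",  "mapped_coverage": 0.95,
     "protected_turns": 0.90, "low_complexity": 0.80},
]
best = max(routes, key=lambda r: autonomy_score(r, weights))
print(best["name"])  # the longer but autonomy-friendlier route wins
```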

Systems and methods for streaming processing for autonomous vehicles
11713006 · 2023-08-01

Generally, the present disclosure is directed to systems and methods for streaming processing within one or more systems of an autonomy computing system. When an update for a particular object or region of interest is received by a given system, the system can control transmission of data associated with the update as well as a determination of other aspects by the given system. For example, the system can determine based on a received update for a particular aspect and a priority classification and/or interaction classification determined for that aspect whether data associated with the update should be transmitted to a subsequent system before waiting for other updates to arrive.
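A minimal sketch of the forwarding decision described above: a high-priority update goes downstream immediately, while a low-priority update waits if it interacts with objects whose updates have not yet arrived. The dictionary fields and classification labels are hypothetical, not the patent's data model.

```python
def should_transmit_now(update, pending_objects):
    """Decide whether to forward an update to the next system immediately
    or hold it, based on its priority and interaction classifications.

    update: dict with a "priority" label and an optional "interacts_with"
            list of object ids this update's object interacts with.
    pending_objects: set of object ids whose updates are still in flight.
    """
    if update["priority"] == "high":
        return True  # e.g. a nearby pedestrian: never wait
    # A low-priority update is held if it interacts with an object whose
    # update has not arrived, so the downstream system sees them together.
    interacting = set(update.get("interacts_with", []))
    return not (interacting & pending_objects)

urgent = should_transmit_now({"priority": "high"}, {"obj7"})
held = should_transmit_now(
    {"priority": "low", "interacts_with": ["obj7"]}, {"obj7"})
print(urgent, held)  # the high-priority update passes; the other waits
```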

Technologies for acting based on object tracking
11703593 · 2023-07-18

This disclosure enables various technologies involving various actions based on tracking an object via a plurality of distance sensors, without synchronizing carrier waves of the distance sensors or without employing a phase-locked loop (PLL) technique on the distance sensors.