Patent classifications
G01S13/60
ENHANCED DOPPLER RADAR SYSTEMS AND METHODS
Techniques are disclosed for systems and methods to provide remote sensing imagery for mobile structures. A remote sensing imagery system includes a radar assembly mounted to a mobile structure and a coupled logic device. The radar assembly includes an orientation and position sensor (OPS) coupled to or within the radar assembly and configured to provide orientation and position data associated with the radar assembly. The logic device is configured to receive radar returns corresponding to a detected target from the radar assembly and orientation and/or position data corresponding to the radar returns from the OPS, determine a target radial speed corresponding to the detected target, and then generate remote sensor image data based on the radar returns and the target radial speed. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
LEARNING DATA COLLECTING SYSTEM, METHOD OF COLLECTING LEARNING DATA, AND ESTIMATING DEVICE
The present disclosure provides a learning data collecting system that makes learning data easy to collect. The learning data collecting system includes a radar, a communication device, and processing circuitry. The radar receives a reflection wave of a radio wave transmitted around a ship and generates echo data associated with a direction. The communication device receives travel data of another ship, the travel data containing position data of that ship. The processing circuitry extracts, from the echo data, partial echo data of an area corresponding to the position data.
RADAR APPARATUS
In a radar apparatus, a determining unit compares a ratio (−Vr/Vn) of a relative velocity (Vr) to a radar-apparatus-installed vehicle velocity (Vn) with a determination value (α) that is a cosine (cos θc) of the detection limit angle (±θc), or the cosine (cos θc) plus a correction value accounting for measurement error. When a determination is made that the ratio (−Vr/Vn) exceeds the determination value (α), the target is determined to be a real target of a crossing object, such as a crossing pedestrian, or a stationary object; otherwise, the target is determined to be a ghost of a crossing object or a stationary object. Thus, a real target of a crossing object or a stationary object can be distinguished from a ghost of the object, so that a ghost is not falsely determined to be a real target, preventing inappropriate brake control.
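The test described in this abstract can be sketched in a few lines. For a stationary object at azimuth θ, the ratio −Vr/Vn equals cos θ, which must be at least cos θc for any real object inside the ±θc field of view. The function below is an illustrative reconstruction, not the patent's implementation; the sign and size of the correction margin are assumptions.

```python
import math

def classify_target(vr, vn, theta_c_deg, margin=0.0):
    """Classify a radar detection as a 'real' or 'ghost' target.

    vr          -- relative radial velocity of the target (m/s, negative = approaching)
    vn          -- own-vehicle velocity (m/s, > 0)
    theta_c_deg -- detection limit angle theta_c (degrees)
    margin      -- correction covering measurement error (assumed to loosen the test)
    """
    # For a stationary object, -vr/vn == cos(azimuth).
    ratio = -vr / vn
    # Determination value alpha = cos(theta_c), adjusted by the error margin.
    alpha = math.cos(math.radians(theta_c_deg)) - margin
    # Ratio exceeding alpha implies a genuine object inside the field of view.
    return "real" if ratio >= alpha else "ghost"
```

For example, with Vn = 20 m/s and θc = 45°, a stationary object straight ahead yields Vr = −20 m/s and ratio 1.0 (real), while a multipath ghost whose measured Vr = −10 m/s yields ratio 0.5 < cos 45° (ghost).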
Method of determining position of vehicle and vehicle using the same
Provided is an autonomous vehicle including a storage configured to store a map including two-dimensionally represented road surface information and three-dimensionally represented structure information, a camera configured to obtain a two-dimensional (2D) image of a road surface in a vicinity of the vehicle, a light detection and ranging (LiDAR) unit configured to obtain three-dimensional (3D) spatial information regarding structures in a vicinity of the vehicle, and a controller comprising processing circuitry configured to determine at least one of the camera or the LiDAR unit as a position sensor, based on whether it is possible to obtain information regarding the road surface and/or the structures in the vicinity of the vehicle, to identify a position of the vehicle on the map corresponding to a current position of the vehicle using the position sensor, and to perform autonomous driving based on the identified position on the map.
METHOD AND SYSTEM FOR RADAR-BASED ODOMETRY
An odometry solution for a device within a moving platform is provided using a deep neural network. Radar measurements may be obtained, such that static objects are detected based at least in part on the obtained radar measurements. Odometry information for the platform is estimated based at least in part on the detected static objects and the obtained radar measurements.
Systems and Methods for End-to-End Trajectory Prediction Using Radar, Lidar, and Maps
Systems and methods for trajectory prediction are provided. A method can include obtaining LIDAR data, radar data, and map data; inputting the LIDAR data, the radar data, and the map data into a network model; transforming, by the network model, the radar data into a coordinate frame associated with a most recent radar sweep in the radar data; generating, by the network model, one or more features for each of the LIDAR data, the transformed radar data, and the map data; combining, by the network model, the one or more generated features to generate fused feature data; generating, by the network model, prediction data based at least in part on the fused feature data; and receiving, as an output of the network model, the prediction data. The prediction data can include a respective predicted trajectory for a future time period for one or more detected objects.
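The concrete geometric step in this pipeline, re-expressing earlier radar sweeps in the coordinate frame of the most recent sweep, can be sketched as a planar rigid-body transform. This is a hypothetical illustration under the assumption that each sweep carries a world pose (x, y, yaw) of the sensor; the function names are not from the patent.

```python
import numpy as np

def transform_to_latest_frame(points_xy, pose_sweep, pose_latest):
    """Re-express 2-D radar points from an earlier sweep in the
    coordinate frame of the most recent sweep.

    points_xy   -- (N, 2) array of points in the earlier sweep's frame
    pose_sweep  -- (x, y, yaw) world pose of the sensor at the earlier sweep
    pose_latest -- (x, y, yaw) world pose at the most recent sweep
    """
    def se2(x, y, yaw):
        # Homogeneous 3x3 rigid transform for a planar pose.
        c, s = np.cos(yaw), np.sin(yaw)
        return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

    # latest_T_sweep maps sweep-frame coordinates into the latest frame.
    T = np.linalg.inv(se2(*pose_latest)) @ se2(*pose_sweep)
    pts_h = np.column_stack([points_xy, np.ones(len(points_xy))])
    return (pts_h @ T.T)[:, :2]
```

After this alignment, features from all sweeps share one frame and can be concatenated with the LIDAR and map features before prediction.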
METHOD AND SYSTEM FOR MEASUREMENT OF THE SPEED OF A COMPETITOR ON A RACE COURSE
The method makes it possible to measure a speed of a competitor, such as a skier or snowboarder, on a ski or snowboard course, by means of a measurement system with a transponder module worn by the competitor. The method includes the step of activating the transponder module which includes a measurement unit with at least one measurement sensor, as soon as the race starts, a step of measuring the speed of the competitor via the measurement unit during the race along the course, and a step of transmitting the competitor's speed measurement to at least one base station of a decoder device of the measurement system for a timing device in order to display the speed of the competitor in real time or continuously on at least one screen.
SYSTEM AND METHOD FOR AUTOMOTIVE RADAR SENSOR ORIENTATION ESTIMATION USING RADAR DETECTION INFORMATION OF ARBITRARY DETECTIONS
A mechanism is provided for estimating mounting orientation yaw and pitch of a radar sensor without need of prior knowledge or information from any other sensor on an automobile. Embodiments estimate the sensor heading (e.g., azimuth) due to movement of the automobile from radial relative velocities and azimuths of radar target detections. This can be performed at every system cycle, when a new radar detection occurs. Embodiments then can estimate the sensor mounting orientation (e.g., yaw) from multiple sensor heading estimations. For further accuracy, embodiments can also take into account target elevation measurements to either more accurately determine sensor azimuth and yaw or to also determine mounting pitch orientation.
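The heading estimation described here rests on a standard relation: a stationary target at azimuth a_i, seen from a sensor moving with speed v along heading h (in the sensor frame), produces radial speed vr_i = −v·cos(a_i − h), which is linear in the unknowns (v·cos h, v·sin h) and can be solved by least squares over arbitrary detections. The sketch below is an assumed reconstruction of that step, not the patent's implementation.

```python
import numpy as np

def estimate_sensor_heading(azimuths_rad, radial_speeds):
    """Estimate the sensor heading (direction of ego-motion in the sensor
    frame) and ego speed from static-target radar detections.

    azimuths_rad  -- target azimuths in the sensor frame (radians)
    radial_speeds -- measured radial speeds (m/s, negative = approaching)
    """
    # vr_i = -(v cos h) cos(a_i) - (v sin h) sin(a_i)
    # => solve A x = -vr for x = (v cos h, v sin h) in least squares.
    A = np.column_stack([np.cos(azimuths_rad), np.sin(azimuths_rad)])
    x, *_ = np.linalg.lstsq(A, -np.asarray(radial_speeds), rcond=None)
    heading = np.arctan2(x[1], x[0])  # radians, sensor frame
    speed = np.hypot(x[0], x[1])
    return heading, speed
```

Repeating this per system cycle and averaging the heading estimates over time yields the mounting yaw; with elevation measurements the same linear model extends to a third column for pitch.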