G01S13/874

Improved Lift Detection for a Robotic Work Tool
20220022371 · 2022-01-27

A robotic work tool system (200) comprising a robotic work tool (100) comprising a distance sensor (190, 190′, 190″), the robotic work tool (100) being configured to: determine a sensed distance (SD) to a surface travelled (G); determine whether the sensed distance is greater than a threshold distance; and, if so, detect a lift event.
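The threshold comparison described above can be sketched in a few lines of Python; the 50 mm threshold and the debounce over consecutive readings are illustrative assumptions, not values from the patent:

```python
def detect_lift(samples, threshold_mm=50.0, consecutive=3):
    """Flag a lift event once `consecutive` successive distance readings
    exceed the threshold -- a simple debounce against sensor noise.
    The 50 mm threshold and 3-sample debounce are assumed values."""
    run = 0
    for distance in samples:
        run = run + 1 if distance > threshold_mm else 0
        if run >= consecutive:
            return True
    return False
```

A real implementation would run on streaming sensor samples and could combine the distance check with other signals before stopping the cutting tool.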

Method and device for determining an exact position of a vehicle with the aid of radar signatures of the vehicle surroundings

A method and device for determining a first highly precise position of a vehicle. The method includes acquiring surrounding-area data values using at least one radar sensor of the vehicle, the surrounding-area data values representing a surrounding area of the vehicle; and determining a rough position of the vehicle as a function of the acquired surrounding area data values. In addition, the method includes determining surrounding-area feature data values as a function of the determined rough position of the vehicle, the surrounding-area feature data values representing at least one surrounding-area feature and a second highly precise position of the at least one surrounding-area feature; and determining the first highly precise position of the vehicle as a function of the at least one surrounding-area feature, according to predefined localization criteria, the first highly precise position of the vehicle being more precise than the rough position of the vehicle.
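The coarse-to-fine scheme above — a rough position, mapped surrounding-area features with precise positions, then a refined position — can be sketched as follows. The nearest-neighbour matching and simple averaging are assumed placeholder choices for the unspecified "predefined localization criteria":

```python
import math

def refine_position(rough_pos, observations, feature_map, max_match_dist=5.0):
    """Two-stage localization sketch. Each observation is a (dx, dy) offset
    from the vehicle to a detected feature. Match each observed feature to
    the nearest mapped feature near the rough position, then estimate the
    precise position as the mean of (mapped position - observed offset)."""
    estimates = []
    for dx, dy in observations:
        # Predicted world position of the feature from the rough pose
        px, py = rough_pos[0] + dx, rough_pos[1] + dy
        # Nearest mapped feature (its "highly precise position" is known)
        fx, fy = min(feature_map,
                     key=lambda f: math.hypot(f[0] - px, f[1] - py))
        if math.hypot(fx - px, fy - py) <= max_match_dist:
            estimates.append((fx - dx, fy - dy))
    if not estimates:
        return rough_pos  # no usable features: fall back to rough position
    n = len(estimates)
    return (sum(e[0] for e in estimates) / n,
            sum(e[1] for e in estimates) / n)
```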

UNMANNED AERIAL VEHICLE AND LOCALIZATION METHOD FOR UNMANNED AERIAL VEHICLE

An unmanned aerial vehicle and a localization method for an unmanned aerial vehicle are described. In an embodiment, an unmanned aerial vehicle comprises: a first ultra-wide band node; a second ultra-wide band node; and a localization processor configured to use signals received at the first ultra-wide band node and the second ultra-wide band node from a plurality of anchor nodes located within an environment to estimate a pose state of the unmanned aerial vehicle, wherein the first ultra-wide band node and the second ultra-wide band node are arranged on the unmanned aerial vehicle at positions offset from one another.
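One reason for two offset UWB nodes is that their two position fixes determine heading as well as position. A minimal planar sketch (the anchor-ranging step that produces each node's position estimate is omitted, and the front/rear mounting is an assumption):

```python
import math

def pose_from_uwb_nodes(front_node_xy, rear_node_xy):
    """Recover a planar pose from the estimated positions of two UWB
    nodes mounted at offset points on the airframe: body position as
    the midpoint, heading from the rear-to-front direction."""
    fx, fy = front_node_xy
    rx, ry = rear_node_xy
    x, y = (fx + rx) / 2.0, (fy + ry) / 2.0
    heading = math.atan2(fy - ry, fx - rx)
    return x, y, heading
```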

Multi-sensor data fusion-based aircraft detection, tracking, and docking

Tracking aircraft in and near a ramp area is described herein. One method includes receiving camera image data of an aircraft while the aircraft is approaching or in the ramp area, receiving LIDAR/Radar sensor data of the aircraft while the aircraft is approaching or in the ramp area, and merging the camera image data and the LIDAR/Radar sensor data into a merged data set, wherein the merged data set includes at least one of: data for determining the position and orientation of the aircraft relative to the position and orientation of the ramp area, data for determining the speed of the aircraft, data for determining the direction of the aircraft, data for determining the proximity of the aircraft to a particular object within the ramp area, and data for forming a three-dimensional virtual model of at least a portion of the aircraft from the merged data.
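One standard way to merge two noisy estimates of the same quantity, as in the camera and LIDAR/Radar fusion above, is inverse-variance weighting. This sketch assumes per-sensor position estimates with known variances, which the abstract does not specify:

```python
def fuse_estimates(camera_pos, camera_var, radar_pos, radar_var):
    """Combine camera- and LIDAR/Radar-derived position estimates by
    inverse-variance weighting: the more certain sensor gets more weight,
    and the fused variance is smaller than either input variance."""
    w_cam = 1.0 / camera_var
    w_rad = 1.0 / radar_var
    fused = [(w_cam * c + w_rad * r) / (w_cam + w_rad)
             for c, r in zip(camera_pos, radar_pos)]
    fused_var = 1.0 / (w_cam + w_rad)
    return fused, fused_var
```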

Vehicle Localization Based on Radar Detections in Garages
20230360529 · 2023-11-09

This document describes techniques and systems for vehicle localization based on radar detections in garages and other GNSS denial environments. In some examples, a system includes a processor and computer-readable storage media comprising instructions that, when executed, cause the system to obtain structure data regarding a GNSS denial environment and generate, from the structure data, radar localization landmarks. The radar localization landmarks include edges or corners of the GNSS denial environment. The instructions also cause the processor to generate polylines along or between the radar localization landmarks to generate a radar occupancy grid. The instructions further cause the processor to receive radar detections from one or more radar sensors and obtain a vehicle pose within the radar occupancy grid to localize the vehicle within the GNSS denial environment. In this way, the system can provide highly accurate vehicle localization in GNSS denial environments in a cost-effective manner.
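The polyline-to-occupancy-grid step can be illustrated by rasterizing segments drawn between landmark corners into grid cells. Cell size, grid dimensions, and the integer line-stepping scheme below are assumptions for the sketch:

```python
def rasterize_polyline(points, grid_size):
    """Build a toy occupancy grid by marking the cells along polylines
    between radar localization landmarks (e.g., garage wall corners).
    Points are integer (x, y) cell coordinates; cell size is 1."""
    grid = [[0] * grid_size for _ in range(grid_size)]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            x = round(x0 + (x1 - x0) * i / steps)
            y = round(y0 + (y1 - y0) * i / steps)
            grid[y][x] = 1  # occupied cell along the wall segment
    return grid
```

Radar detections would then be scored against occupied cells to estimate the vehicle pose within the grid.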

SENSOR DATA COMPRESSION FOR MAP CREATION FOR AUTONOMOUS SYSTEMS AND APPLICATIONS
20230296757 · 2023-09-21

One or more embodiments of the present disclosure may relate to communicating RADAR (RAdio Detection And Ranging) data to a distributed map system that is configured to generate map data based on the RADAR data. In these or other embodiments, certain compression operations may be performed on the RADAR data to reduce the amount of data that is communicated from the ego-machines to the map system.
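A toy example of the kind of compression that reduces upload volume from ego-machines: quantizing detection coordinates to a fixed resolution and delta-encoding them so the transmitted values stay small. The actual compression operations are not specified in the abstract:

```python
def compress_points(points, resolution=0.1):
    """Quantize (x, y) radar detections to a fixed resolution and
    delta-encode successive points (lossy up to the quantization step)."""
    quantized = [(round(x / resolution), round(y / resolution))
                 for x, y in points]
    deltas, prev = [], (0, 0)
    for q in quantized:
        deltas.append((q[0] - prev[0], q[1] - prev[1]))
        prev = q
    return deltas

def decompress_points(deltas, resolution=0.1):
    """Inverse of compress_points: accumulate deltas and rescale."""
    points, x, y = [], 0, 0
    for dx, dy in deltas:
        x, y = x + dx, y + dy
        points.append((x * resolution, y * resolution))
    return points
```

In practice the small integer deltas would be further packed with an entropy coder before transmission.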

SENSOR DATA BASED MAP CREATION FOR AUTONOMOUS SYSTEMS AND APPLICATIONS

One or more embodiments of the present disclosure relate to generation of map data. In these or other embodiments, the generation of the map data may include determining whether objects indicated by the sensor data are static objects or dynamic objects. Additionally or alternatively, sensor data may be removed or included in the map data based on determinations as to whether it corresponds to static objects or dynamic objects.
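The static/dynamic split might, for instance, rely on per-detection Doppler or tracked ground speed. This sketch assumes such a speed estimate is available for each detection; the threshold is an illustrative value:

```python
def filter_static_detections(detections, speed_threshold=0.5):
    """Keep only detections whose estimated ground speed is below a
    threshold, so map data is built from static structure (walls, poles)
    rather than moving traffic. Each detection is (x, y, speed_m_s)."""
    return [(x, y) for x, y, v in detections if abs(v) < speed_threshold]
```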

SENSOR DATA BASED MAP CREATION AND LOCALIZATION FOR AUTONOMOUS SYSTEMS AND APPLICATIONS

One or more embodiments of the present disclosure relate to aligning sensor data. In some embodiments, the aligning may be used for performing localization. In these or other embodiments, the aligning may be used for map creation.
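Aligning sensor data for localization or map creation is often a rigid registration step. The following is a standard 2-D least-squares alignment of corresponding point pairs (the inner step of ICP-style methods), shown as a generic sketch rather than the patent's specific procedure:

```python
import math

def align_2d(source, target):
    """Find the rotation theta and translation (tx, ty) that best map
    `source` points onto corresponding `target` points (closed-form
    least-squares solution for the planar rigid case)."""
    n = len(source)
    scx = sum(p[0] for p in source) / n
    scy = sum(p[1] for p in source) / n
    tcx = sum(p[0] for p in target) / n
    tcy = sum(p[1] for p in target) / n
    # Accumulate dot/cross terms of the centered point sets
    s_dot = s_cross = 0.0
    for (sx, sy), (px, py) in zip(source, target):
        ax, ay = sx - scx, sy - scy
        bx, by = px - tcx, py - tcy
        s_dot += ax * bx + ay * by
        s_cross += ax * by - ay * bx
    theta = math.atan2(s_cross, s_dot)
    c, s = math.cos(theta), math.sin(theta)
    tx = tcx - (c * scx - s * scy)
    ty = tcy - (s * scx + c * scy)
    return theta, tx, ty
```

Full ICP would alternate this step with re-finding correspondences until convergence.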

SENSOR DATA BASED MAP CREATION AND LOCALIZATION FOR AUTONOMOUS SYSTEMS AND APPLICATIONS

One or more embodiments of the present disclosure relate to generating RADAR (RAdio Detection And Ranging) point clouds based on RADAR data obtained from one or more RADAR sensors disposed on one or more ego-machines. In these or other embodiments, the RADAR point clouds may be communicated to a distributed map system that is configured to generate map data based on the RADAR point clouds. In some embodiments of the present disclosure, certain compression operations may be performed on the RADAR point clouds to reduce the amount of data that is communicated from the ego-machines to the map system.
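Generating a point cloud from radar returns typically means converting each range/azimuth detection from the sensor frame into world coordinates using the ego-machine's pose. A planar sketch under that assumption (elevation and a separate sensor-mounting offset are ignored):

```python
import math

def radar_to_point_cloud(detections, ego_pose):
    """Convert (range_m, azimuth_rad) radar detections into world-frame
    (x, y) points given the ego pose (x, y, heading_rad)."""
    ex, ey, eh = ego_pose
    cloud = []
    for rng, az in detections:
        # Sensor-frame Cartesian point
        lx, ly = rng * math.cos(az), rng * math.sin(az)
        # Rotate by ego heading, translate by ego position
        wx = ex + lx * math.cos(eh) - ly * math.sin(eh)
        wy = ey + lx * math.sin(eh) + ly * math.cos(eh)
        cloud.append((wx, wy))
    return cloud
```

Clouds accumulated this way across many ego-machines would then be compressed and uploaded to the distributed map system, as the abstract describes.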