G01S17/86

GROUND ENGAGING TOOL WEAR AND LOSS DETECTION SYSTEM AND METHOD

An example wear detection system receives a plurality of images from a plurality of sensors associated with a work machine. Individual sensors of the plurality of sensors have respective fields-of-view different from those of the other sensors. The wear detection system identifies a first region of interest and a second region of interest associated with at least one ground engaging tool (GET). The wear detection system determines a first set of image points and a second set of image points for the at least one GET based on geometric parameters associated with the GET, and derives a GET measurement from the image points. The wear detection system determines a wear level or loss for the at least one GET based on the GET measurement.
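As a rough illustration of the final step, a wear level can be derived by comparing a measured GET dimension against known geometric parameters. The function names, the millimetre units, and the linear wear model below are illustrative assumptions and are not taken from the patent:

```python
def wear_level(measured_length_mm: float, nominal_length_mm: float,
               worn_out_length_mm: float) -> float:
    """Map a measured tooth length onto a 0.0 (new) .. 1.0 (fully worn) scale,
    assuming wear is linear between the nominal and worn-out lengths."""
    if nominal_length_mm <= worn_out_length_mm:
        raise ValueError("nominal length must exceed worn-out length")
    wear = (nominal_length_mm - measured_length_mm) / (
        nominal_length_mm - worn_out_length_mm)
    return max(0.0, min(1.0, wear))  # clamp measurement noise into range


def is_lost(measured_length_mm: float, loss_threshold_mm: float) -> bool:
    """Flag a GET as lost when the measurement collapses below a threshold."""
    return measured_length_mm < loss_threshold_mm
```

A tooth measured at its nominal length maps to 0.0 wear; one at the worn-out length maps to 1.0.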

SENSOR ASSEMBLY WITH LIDAR FOR AUTONOMOUS VEHICLES

A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.
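The claimed uninterrupted front-to-rear coverage can be sanity-checked numerically. The sketch below assumes each camera is summarized by a center yaw (0° = forward, 180° = rearward) and a horizontal field of view on one side of the vehicle; these parameters and the sampling approach are illustrative assumptions, not details from the abstract:

```python
def covers_front_to_rear(cameras, step_deg=1.0):
    """Check that the union of camera fields of view covers every bearing
    from 0 deg (forward) to 180 deg (rearward) on one side of the vehicle.
    Each camera is a (center_yaw_deg, horizontal_fov_deg) pair."""
    bearing = 0.0
    while bearing <= 180.0:
        # A bearing is covered if it lies within half the FOV of any camera.
        if not any(abs(bearing - yaw) <= fov / 2.0 for yaw, fov in cameras):
            return False
        bearing += step_deg
    return True
```

With a roof camera at (0°, 120°) plus mirror cameras at (45°, 90°), (90°, 90°), and (160°, 60°), the sweep is gap-free; dropping the rearward camera leaves bearings past about 135° uncovered.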

SENSOR ASSEMBLY WITH RADAR FOR AUTONOMOUS VEHICLES

A sensor assembly for autonomous vehicles includes a side mirror assembly configured to mount to a vehicle. The side mirror assembly includes a first camera having a field of view in a direction opposite a direction of forward travel of the vehicle; a second camera having a field of view in the direction of forward travel of the vehicle; and a third camera having a field of view in a direction substantially perpendicular to the direction of forward travel of the vehicle. The first camera, the second camera, and the third camera are oriented to provide, in combination with a fourth camera configured to be mounted on a roof of the vehicle, an uninterrupted camera field of view from the direction of forward travel of the vehicle to a direction opposite the direction of forward travel of the vehicle.

Method and Device for Making Sensor Data More Robust Against Adversarial Perturbations

The disclosure relates to a method for making sensor data more robust to adversarial perturbations. Sensor data are obtained from at least two sensors, and the sensor data from each sensor are replaced piecewise by means of quilting. The piecewise replacement is carried out in such a way that the replaced sensor data from the different sensors are plausible relative to one another, and the piecewise-replaced sensor data are then output.
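Quilting replaces pieces of an input with nearest neighbours drawn from a database of known clean pieces, which tends to strip adversarial perturbations. The 1-D sketch below is a simplified illustration of that idea together with a cross-sensor plausibility check; the patch length, the database, and the tolerance are assumptions, not details from the disclosure:

```python
import numpy as np


def quilt(signal, patch_db, patch_len=4):
    """Replace each patch of a 1-D signal with its nearest neighbour
    from a database of clean patches (simplified quilting)."""
    out = signal.copy()
    for start in range(0, len(signal) - patch_len + 1, patch_len):
        patch = signal[start:start + patch_len]
        # Euclidean nearest-neighbour lookup over the clean-patch database.
        dists = np.linalg.norm(patch_db - patch, axis=1)
        out[start:start + patch_len] = patch_db[np.argmin(dists)]
    return out


def mutually_plausible(quilted_a, quilted_b, tol=1.0):
    """Accept the replacements only if the quilted data from two sensors
    agree with each other within a tolerance."""
    return float(np.max(np.abs(quilted_a - quilted_b))) <= tol
```

After quilting, every output patch is an exact member of the clean database, so small perturbations on the input are discarded.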

LOOP CLOSURE DETECTION METHOD AND SYSTEM, MULTI-SENSOR FUSION SLAM SYSTEM, ROBOT, AND MEDIUM
20230045796 · 2023-02-16

The present invention provides a loop closure detection method and system, a multi-sensor fusion SLAM system, a robot, and a medium. The system runs on a mobile robot and comprises a similarity detection unit, a visual pose solving unit, and a laser pose solving unit. With the loop closure detection system, the multi-sensor fusion SLAM system, and the robot provided in the present invention, the speed and accuracy of loop closure detection can be significantly improved in cases such as a change in the robot's viewing angle, a change in environmental brightness, or weak texture.
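A similarity detection unit of the kind described can, for example, compare a global descriptor of the current frame against descriptors of past keyframes and report a loop candidate when the match is strong enough. The cosine-similarity formulation and the 0.9 threshold below are illustrative assumptions, not the patented method:

```python
import math


def cosine_similarity(a, b):
    """Cosine of the angle between two descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)


def detect_loop(query_desc, keyframe_descs, threshold=0.9):
    """Return the index of the best-matching past keyframe whose similarity
    exceeds the threshold, or None when no loop candidate is found."""
    best_idx, best_sim = None, threshold
    for i, desc in enumerate(keyframe_descs):
        sim = cosine_similarity(query_desc, desc)
        if sim >= best_sim:
            best_idx, best_sim = i, sim
    return best_idx
```

In a fused system, a candidate found this way would then be verified by the visual and laser pose solving units before the loop constraint is accepted.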

CONSTRUCTION SITE DIGITAL FIELD BOOK FOR THREE-DIMENSIONAL SCANNERS
20230047975 · 2023-02-16

A method, system, and computer product for tracking scanning data acquired by a three-dimensional (3D) coordinate scanner are provided. The method includes storing a digital representation of an environment in memory of a mobile computing device. A first scan is performed with the 3D coordinate scanner in an area of the environment. A location of the first scan is determined on the digital representation, and the first scan is registered with the digital representation. The location of the 3D coordinate scanner at the time of the first scan is indicated on the digital representation.
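Registering a scan against the digital representation yields a rigid transform; applying that transform to the scan's own origin gives the scanner's location to indicate on the representation. The 2-D rotation-plus-translation sketch below is an illustrative assumption under that simplification, not the patented method:

```python
import math


def to_plan_coords(point_xy, theta_rad, translation_xy):
    """Apply the rigid transform recovered by registration to map a
    scan-local point into the floor-plan (digital representation) frame."""
    x, y = point_xy
    c, s = math.cos(theta_rad), math.sin(theta_rad)
    tx, ty = translation_xy
    # Standard 2-D rotation followed by translation.
    return (c * x - s * y + tx, s * x + c * y + ty)
```

Because the scanner sits at the origin of its own scan frame, its plotted location is simply the translation component of the registration transform.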