Patent classifications
G05D2111/67
Mobile robot using artificial intelligence and controlling method thereof
A mobile robot of the present disclosure includes: a traveling unit configured to move a main body; a lidar sensor configured to acquire terrain information outside the main body; a camera sensor configured to acquire an image outside the main body; and a controller configured to fuse the image and a detection signal of the lidar sensor to select a front edge for the next movement and set a target location of the next movement at the front edge to perform mapping travelling. Therefore, in a situation where there is no map, the mobile robot can build an accurate map with minimal speed changes while it travels.
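The "front edge" selection described above resembles frontier-based exploration on an occupancy grid. The following is a minimal sketch of that idea, not the patent's actual method: the grid encoding (`UNKNOWN`/`FREE`/`OCCUPIED`) and the nearest-frontier target policy are assumptions for illustration.

```python
import numpy as np

# Assumed occupancy-grid encoding (not from the patent):
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def find_frontiers(grid):
    """Return (row, col) cells that are FREE and border UNKNOWN space.

    Such 'front edges' separate mapped free space from unexplored
    terrain, making them natural targets for mapping travel.
    """
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            # 3x3 neighbourhood, clipped at the grid border.
            neighbours = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighbours == UNKNOWN).any():
                frontiers.append((r, c))
    return frontiers

def pick_target(grid, robot_pos):
    """Choose the nearest frontier cell as the next movement target."""
    frontiers = find_frontiers(grid)
    if not frontiers:
        return None  # map is complete
    return min(frontiers,
               key=lambda f: (f[0] - robot_pos[0]) ** 2 + (f[1] - robot_pos[1]) ** 2)
```

A real implementation would plan a smooth path to the chosen target rather than teleporting to it, which is where the abstract's "minimum speed change" goal comes in.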
Microlensing for real-time sensing of stray light
Example embodiments relate to microlensing for real-time sensing of stray light. An example device includes an image sensor that includes a plurality of light-sensitive pixels. The device also includes a first lens positioned over a first subset of light-sensitive pixels selected from the plurality of light-sensitive pixels. Further, the device includes a controller. The controller is configured to determine a first angle of incidence of a first light signal detected by the first subset of light-sensitive pixels. The controller is also configured to, based on the first determined angle of incidence, determine an amount of stray light incident on the image sensor.
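One common way to recover an angle of incidence from a pixel subset under a microlens is an intensity-weighted centroid: light arriving at angle θ focuses at an offset of roughly f·tan(θ) from the subset's centre. This sketch assumes a 1-D pixel subset and an ideal thin lens; the function name and parameters are hypothetical, not from the patent.

```python
import math

def estimate_angle(intensities, pixel_pitch, focal_length):
    """Estimate the angle of incidence (degrees) for one microlens.

    `intensities` are the readings of the pixel subset under the lens.
    The intensity-weighted centroid gives the focal-plane offset, and
    offset = focal_length * tan(angle) inverts to the angle.
    """
    total = sum(intensities)
    centre = (len(intensities) - 1) / 2.0
    centroid = sum(i * v for i, v in enumerate(intensities)) / total
    offset = (centroid - centre) * pixel_pitch
    return math.degrees(math.atan2(offset, focal_length))
```

Comparing the estimated angle against the angles expected from the scene geometry is one way such a device could flag stray (off-axis) light.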
Autonomous Travel Support Device, Control Method Of Working Vehicle, Program, And Working Vehicle
An autonomous travel support device is provided that is structurally and electrically attachable to and detachable from a working vehicle. The working vehicle is configured to travel in response to a first operation input by a user. The autonomous travel support device comprises a processor configured to execute a reception step of receiving a switching input being an input for switching a travel mode of the working vehicle to a first mode or a second mode; and a vehicle control step of causing the working vehicle to travel in response to the first operation input in a case where the travel mode is set to the first mode, and of causing the working vehicle to travel autonomously using a preset autonomous travel model in a case where the travel mode is set to the second mode.
AUTOMATIC MULTI-MODALITY SENSOR CALIBRATION WITH NEAR-INFRARED IMAGES
Systems and methods for automatic multi-modality sensor calibration with near-infrared images (NIR). Image keypoints from collected images and NIR keypoints from NIR can be detected. A deep-learning-based neural network that learns relation graphs between the image keypoints and the NIR keypoints can match the image keypoints and the NIR keypoints. Three dimensional (3D) points from 3D point cloud data can be filtered based on corresponding 3D points from the NIR keypoints (NIR-to-3D points) to obtain filtered NIR-to-3D points. An extrinsic calibration can be optimized based on a reprojection error computed from the filtered NIR-to-3D points to obtain an optimized extrinsic calibration for an autonomous entity control system. An entity can be controlled by employing the optimized extrinsic calibration for the autonomous entity control system.
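The reprojection error being minimized above can be sketched as follows. This is a generic pinhole-camera formulation, assuming intrinsics `K` and an extrinsic candidate `(R, t)`; the specific optimizer and point filtering in the patent are not reproduced here.

```python
import numpy as np

def reprojection_error(points_3d, keypoints_2d, K, R, t):
    """Mean pixel reprojection error of 3D points against 2D keypoints.

    Projects the filtered NIR-to-3D points through intrinsics K and the
    extrinsic candidate (R, t); minimizing this error over (R, t)
    yields the optimized extrinsic calibration.
    """
    cam = R @ points_3d.T + t.reshape(3, 1)   # world -> camera frame
    proj = K @ cam                            # camera -> image plane
    uv = (proj[:2] / proj[2]).T               # perspective divide
    return float(np.mean(np.linalg.norm(uv - keypoints_2d, axis=1)))
```

In practice this scalar would be fed to a nonlinear least-squares solver over a rotation parameterization (e.g. axis-angle) and the translation.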
SYSTEMS AND METHODS OF SENSOR DATA FUSION
Systems and methods of sensor data fusion including sensor data capture, curation, linking, fusion, inference, and validation. The systems and methods described herein reduce computational demand and processing time by curating data and calculating conditional entropy. The system is operable to fuse data from a plurality of sensor types. A computer processor optionally stores fused sensor data that the system validates above a mathematical threshold.
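The conditional entropy used for curation above can be estimated from paired sensor observations. This is a textbook plug-in estimator of H(Y | X), not the patent's specific formulation; the discretization of the sensor readings is assumed.

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """Estimate H(Y | X) in bits from (x, y) observation pairs.

    A low value means stream Y adds little information beyond X,
    so Y is a candidate for curation before fusion.
    """
    joint = Counter(pairs)                 # counts of (x, y)
    marg_x = Counter(x for x, _ in pairs)  # counts of x alone
    n = len(pairs)
    h = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        p_y_given_x = c / marg_x[x]
        h -= p_xy * math.log2(p_y_given_x)
    return h
```

For example, two perfectly redundant streams give H(Y | X) = 0 bits, while a stream independent of X retains its full marginal entropy.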
DYNAMIC OCCUPANCY GRID ARCHITECTURE
Techniques are provided for utilizing a dynamic occupancy grid (DoG) for tracking objects proximate to an autonomous or semi-autonomous vehicle. An example method for generating an object track list in a vehicle includes obtaining sensor information from one or more sensors on the vehicle, determining a first set of object data based at least in part on the sensor information and an object recognition process, generating a dynamic grid based on an environment proximate to the vehicle based at least in part on the sensor information, determining a second set of object data based at least in part on the dynamic grid, and outputting the object track list based on a fusion of the first set of object data and the second set of object data.
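The final fusion step, merging the recognition-based object list with objects extracted from the dynamic grid, can be sketched with simple nearest-neighbour gating. The 2-D point representation, averaging rule, and `gate` radius are illustrative assumptions, not the patented association logic.

```python
def fuse_tracks(detector_objs, grid_objs, gate=2.0):
    """Fuse object-recognition detections with dynamic-grid clusters.

    Each object is an (x, y) position. A grid object within `gate`
    metres of a detector object is merged with it (positions averaged);
    unmatched objects from either source are kept, so the output track
    list covers both modalities.
    """
    fused, used = [], set()
    for dx, dy in detector_objs:
        match = None
        for i, (gx, gy) in enumerate(grid_objs):
            if i not in used and (dx - gx) ** 2 + (dy - gy) ** 2 <= gate ** 2:
                match = i
                break
        if match is None:
            fused.append((dx, dy))          # detector-only object
        else:
            used.add(match)
            gx, gy = grid_objs[match]
            fused.append(((dx + gx) / 2, (dy + gy) / 2))
    # Grid-only objects (e.g. unclassified obstacles) survive fusion too.
    fused += [o for i, o in enumerate(grid_objs) if i not in used]
    return fused
```

Keeping grid-only objects is the point of the dual pipeline: the dynamic grid can track obstacles the object recognizer fails to classify.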
SENSOR MEASUREMENT GRID COMPLEXITY MANAGEMENT
Techniques are provided for generating occupancy grids based on inputs from multiple heterogeneous sensors. An example method for generating an occupancy grid includes obtaining detection information from a plurality of heterogeneous sensors, generating a single measurement grid based on the detection information from the plurality of heterogeneous sensors, determining occupancy probabilities for a plurality of cells in the single measurement grid, and outputting the occupancy grid based at least in part on the occupancy probabilities.
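Per-cell occupancy probabilities from several sensors are conventionally combined in log-odds space, where independent evidence adds. This sketch shows that standard update for a single cell; the independence assumption and the uniform prior are simplifications, not details from the patent.

```python
import math

def log_odds(p):
    """Log-odds representation of a probability."""
    return math.log(p / (1.0 - p))

def fuse_cell(probabilities, prior=0.5):
    """Fuse per-sensor occupancy probabilities for one grid cell.

    Each sensor's evidence shifts the cell's log-odds relative to the
    prior; summing the shifts and converting back gives the fused
    occupancy probability (the standard occupancy-grid update).
    """
    l = log_odds(prior) + sum(log_odds(p) - log_odds(prior)
                              for p in probabilities)
    return 1.0 / (1.0 + math.exp(-l))
```

Two agreeing sensors reinforce each other (e.g. 0.7 and 0.7 fuse above 0.84), while symmetric disagreement (0.9 and 0.1) cancels back to the prior.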