G05D2111/14

Autonomous aerial navigation in low-light and no-light conditions

Autonomous aerial navigation in low-light and no-light conditions includes using night mode obstacle avoidance intelligence and mechanisms for vision-based unmanned aerial vehicle (UAV) navigation to enable autonomous flight operations of a UAV in low-light and no-light environments using infrared data.

Autonomous aerial navigation in low-light and no-light conditions

Autonomous aerial navigation in low-light and no-light conditions includes using night mode obstacle avoidance intelligence, training, and mechanisms for vision-based unmanned aerial vehicle (UAV) navigation to enable autonomous flight operations of a UAV in low-light and no-light environments using infrared data.
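The abstracts above describe switching a UAV to infrared-based perception when ambient light is insufficient. A minimal sketch of that mode-selection idea follows; the lux threshold, function names, and IR-frame format are hypothetical illustrations, not taken from the patents:

```python
# Hypothetical night-mode selector for UAV obstacle avoidance.
LOW_LIGHT_LUX = 10.0  # assumed ambient-light level below which visible-light vision is unreliable

def select_navigation_mode(ambient_lux: float, ir_available: bool) -> str:
    """Pick the perception mode used for autonomous flight."""
    if ambient_lux >= LOW_LIGHT_LUX:
        return "visible"
    return "infrared" if ir_available else "hold"

def obstacle_detected(ir_frame, threshold=0.8) -> bool:
    """Flag an obstacle when any IR pixel exceeds an assumed intensity threshold."""
    return any(px >= threshold for row in ir_frame for px in row)
```

In practice the switchover would feed a trained obstacle-avoidance model rather than a single threshold, but the structure, choosing the sensing modality before running avoidance logic, is the same.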

Vehicle guidance via infrared projection

A system for guiding a vehicle is provided. The system includes multiple paths on a surface, wherein each path is defined by the light projection characteristics of a respective light projection. The system also includes the vehicle. The vehicle includes a sensor configured to detect the light projection characteristic of a respective path of the multiple paths, and a controller configured to guide the vehicle along the respective path whose light projection characteristic matches an expected light projection characteristic assigned to the vehicle.
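The matching step in this abstract, following only the path whose projection matches the characteristic assigned to the vehicle, can be sketched as below. The characteristics chosen here (projection wavelength and modulation frequency) and all names are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class PathProjection:
    path_id: str
    wavelength_nm: float  # assumed characteristic: projected light wavelength
    pulse_hz: float       # assumed characteristic: modulation frequency

def matches(observed, expected, wl_tol=5.0, hz_tol=0.5) -> bool:
    """Compare a detected projection against the characteristic assigned to the vehicle."""
    return (abs(observed.wavelength_nm - expected.wavelength_nm) <= wl_tol
            and abs(observed.pulse_hz - expected.pulse_hz) <= hz_tol)

def choose_path(detected, assigned):
    """Return the id of the first detected path whose projection matches the assignment."""
    for p in detected:
        if matches(p, assigned):
            return p.path_id
    return None
```

With several vehicles sharing a surface, assigning each a distinct characteristic lets them disambiguate overlapping projected paths with a single sensor.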

Adaptive speed control for line-following robots and method thereof

A line-following robot in an assembly line is provided. The line-following robot includes a microcontroller; a camera connected to the microcontroller, disposed on a front of the line-following robot, and configured to collect line images; an infrared (IR) sensor array connected to the microcontroller, disposed on the line-following robot, and oriented in the direction of travel of the line-following robot; a first wheel set and a second wheel set disposed opposite one another on opposing sides of a bottom of the line-following robot; and a battery. The microcontroller controls a motor speed of the line-following robot based on the upcoming portion of the assembly line by continuously capturing and processing the line images using advanced image processing and computer vision techniques.
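One plausible reading of the adaptive-speed idea is to slow down when the upcoming line bends. A minimal sketch under that assumption, using a binary line image and a crude bend metric (neither is from the patent):

```python
def line_center_per_row(image):
    """Centroid column of bright (line) pixels in each row; image is a 0/1 grid."""
    centers = []
    for row in image:
        cols = [c for c, v in enumerate(row) if v]
        if cols:
            centers.append(sum(cols) / len(cols))
    return centers

def adaptive_speed(image, v_max=1.0, v_min=0.2, gain=0.05):
    """Reduce speed in proportion to how far the line drifts sideways across the frame."""
    centers = line_center_per_row(image)
    if len(centers) < 2:
        return v_min  # no usable line detected: creep forward
    bend = max(centers) - min(centers)  # lateral drift in pixels as a curvature proxy
    return max(v_min, v_max - gain * bend)
```

A real implementation would use the camera frames and IR array together and smooth the speed command over time; this only shows the capture-process-adjust loop in miniature.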

Celestial navigation system for an autonomous vehicle
12298777 · 2025-05-13

A navigation control system for an autonomous vehicle comprises a transmitter and an autonomous vehicle. The transmitter comprises an emitter for emitting at least one signal, a power source for powering the emitter, a device for capturing wireless energy to charge the power source, and a printed circuit board for converting the captured wireless energy to a form for charging the power source. The autonomous vehicle operates within a working area and comprises a receiver for detecting the at least one signal emitted by the emitter, and a processor for determining a relative location of the autonomous vehicle within the working area based on the signal emitted by the emitter.
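The localization step, determining a relative location from the emitter's signal, could take many forms; one simple sketch combines a measured bearing with a distance estimated from received signal strength under a free-space inverse-square model. The model, parameters, and names are all assumptions, not details from the patent:

```python
import math

def distance_from_rssi(power_rx, power_tx):
    """Free-space inverse-square estimate: received power falls off as 1/d^2 (assumed model)."""
    return math.sqrt(power_tx / power_rx)

def relative_position(bearing_rad, power_rx, power_tx=100.0):
    """Transmitter-relative (x, y) of the vehicle from bearing and signal strength."""
    d = distance_from_rssi(power_rx, power_tx)
    return (d * math.cos(bearing_rad), d * math.sin(bearing_rad))
```

Multipath and occlusion make raw signal-strength ranging noisy in practice, which is one reason such systems typically fuse the beacon estimate with odometry.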

Autonomous Control Of Powered Earth-Moving Vehicles To Control Calibration Operations For On-Vehicle Sensors
20250171983 · 2025-05-29

Systems and techniques are described for implementing autonomous control of powered earth-moving vehicles, including to automatically calibrate sensors on a powered earth-moving vehicle, such as to determine position and orientation of directional sensors on movable vehicle parts. For example, an on-vehicle sensor to be calibrated may include a LIDAR sensor located on the powered earth-moving vehicle, such as on a movable component part of the vehicle (e.g., a hydraulic arm, a tool attachment, etc.), and a global common frame of reference is determined for different datasets gathered at different times from such a sensor in order to combine or compare the datasets, such as by determining the sensor position in 3D space at a time of dataset gathering (e.g., relative to another reference point on the vehicle with a known location in the global common frame of reference, such as by using one or more determined transforms).
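The core operation described here, expressing datasets gathered at different times in one global common frame using the sensor's pose at gathering time, reduces to applying a rigid transform per dataset. A 2D sketch (the pose format and names are illustrative assumptions; the real system works in 3D with full rotation matrices or quaternions):

```python
import math

def sensor_to_global(points, sensor_pose):
    """Map sensor-frame points into a global common frame.

    sensor_pose = (x, y, yaw) of the sensor at the moment the dataset was
    gathered, e.g. derived from the hydraulic arm's joint state via a known
    transform to a reference point on the vehicle.
    """
    x0, y0, yaw = sensor_pose
    c, s = math.cos(yaw), math.sin(yaw)
    return [(x0 + c * px - s * py, y0 + s * px + c * py) for px, py in points]
```

Two LIDAR scans taken with the arm in different positions can then be compared or merged simply by transforming each with its own recorded pose before overlaying the results.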

Component calibration using motion capture sensor data

Provided are methods for using motion capture sensors to calibrate a component, which can include receiving motion capture sensor data associated with a motion capture sensor, the motion capture sensor data comprising at least a location of a motion capture marker and a location of a reference point on a vehicle, determining a position of a hardware component associated with the vehicle relative to the reference point on the vehicle based at least in part on the motion capture sensor data, determining the position of the hardware component does not satisfy a calibration threshold associated with the hardware component, determining a hardware component alert associated with the hardware component based at least in part on the determining that the position of the hardware component does not satisfy the calibration threshold associated with the hardware component, and routing the hardware component alert. Systems and computer program products are also provided.
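The threshold-and-alert logic in this abstract can be sketched compactly. The alert structure, threshold semantics, and names below are assumptions for illustration; the patent's "routing" step is reduced here to returning the alert:

```python
def position_error(measured, reference):
    """Euclidean distance between the measured component position and its expected offset."""
    return sum((m - r) ** 2 for m, r in zip(measured, reference)) ** 0.5

def check_calibration(component_id, measured, expected, threshold_m):
    """Return an alert record when the component position drifts past the threshold, else None."""
    err = position_error(measured, expected)
    if err <= threshold_m:
        return None
    return {"component": component_id, "error_m": err, "action": "recalibrate"}
```

In the described system, `measured` would come from motion-capture marker locations relative to the vehicle's reference point, and the returned alert would be routed to an operator or maintenance queue.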

AUTONOMOUS MOBILE ROBOT WITH SAFETY DEPTH CAMERA

Various aspects of techniques, systems, and use cases may be used for using a safety depth camera for controlling an autonomous mobile robot. An example technique may include receiving infrared data from at least two infrared receivers of a safety depth camera affixed to a robotic system, determining a safety status of the robotic system related to a detected object in an environment based on the infrared data, and sending an indication to at least one of emergency braking circuitry of the robotic system or adjustable braking circuitry of the robotic system based on the safety status.
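With two IR receivers, depth typically comes from stereo disparity (Z = f·B/d), and the safety status then selects between the two braking paths named in the abstract. A minimal sketch; the focal length, baseline, distance thresholds, and signal names are assumed values, not from the patent:

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.06):
    """Stereo depth: Z = f * B / d, with assumed focal length and IR-receiver baseline."""
    return focal_px * baseline_m / disparity_px

def braking_signal(min_obstacle_depth_m, stop_dist=0.5, slow_dist=1.5):
    """Route to emergency or adjustable braking based on the nearest detected object."""
    if min_obstacle_depth_m <= stop_dist:
        return "emergency_brake"
    if min_obstacle_depth_m <= slow_dist:
        return "adjustable_brake"
    return "clear"
```

The two-tier output mirrors the abstract's split between emergency braking circuitry (hard stop) and adjustable braking circuitry (graded slowdown).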

SYSTEM AND METHOD FOR UTILIZING SWIR SENSING IN CLEANING MACHINES
20250221593 · 2025-07-10

A system and method for utilizing SWIR sensing in automated cleaning machines, including a first illumination source for emitting radiation at a first wavelength in the SWIR toward a field of view (FOV), a receiver operating in the SWIR to acquire SWIR image data based on radiation reflected from elements located in the FOV, and determining, based on the SWIR image data, the presence of water-based liquids in the FOV.
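SWIR water detection commonly exploits water's strong absorption bands (e.g. near 1450 nm): under active illumination at such a wavelength, wet areas return markedly less light. A minimal sketch of that thresholding idea, with assumed reflectance values, threshold, and names:

```python
def water_mask(swir_frame, dark_threshold=0.3):
    """Mark pixels whose SWIR reflectance is low enough to suggest water absorption.

    Assumes active illumination near a water absorption band (~1450 nm),
    where water-covered surfaces reflect much less of the emitted radiation.
    """
    return [[v < dark_threshold for v in row] for row in swir_frame]

def water_present(swir_frame, dark_threshold=0.3, min_fraction=0.01):
    """Declare liquid present when enough of the FOV falls below the darkness threshold."""
    mask = water_mask(swir_frame, dark_threshold)
    wet = sum(v for row in mask for v in row)
    total = sum(len(row) for row in mask)
    return total > 0 and wet / total >= min_fraction
```

A production system would calibrate the threshold against surface materials and may compare two wavelengths (one inside and one outside the absorption band) to reject dark-but-dry surfaces.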

Method for controlling robot cleaner
12402767 · 2025-09-02

The present disclosure relates to a method for controlling a robot cleaner and, more particularly, to a method for controlling a robot cleaner that includes a plurality of obstacle detection sensors, the method comprising: a first step of approaching a charging device according to an IR signal transmitted from the charging device, the charging device including a charging terminal for charging the robot cleaner; a second step of checking, through the plurality of obstacle detection sensors, whether the shortest distance between the robot cleaner and the charging device is less than or equal to a first distance; a third step of checking, through the plurality of obstacle detection sensors, whether the center lines of the robot cleaner and the charging device are aligned, and aligning the center lines; and a fourth step of, when the center lines have been aligned, moving the robot cleaner straight toward the charging device.
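The four steps read naturally as a small docking state machine: follow the IR beacon, stop at the first distance, square up the center lines, then drive straight in. A sketch under that reading; the state names, distances, tolerance, and command vocabulary are all hypothetical:

```python
def docking_step(state, dist_m, centerline_offset_m,
                 first_distance=0.30, align_tol=0.01):
    """One control tick of the four-step docking sequence.

    Returns (next_state, command). States: 'approach' (follow the IR signal),
    'align' (rotate until center lines coincide), 'dock' (move straight in).
    centerline_offset_m is the signed lateral offset measured by the
    obstacle detection sensors (positive = robot offset to the left).
    """
    if state == "approach":
        if dist_m <= first_distance:
            return "align", "stop"
        return "approach", "follow_ir"
    if state == "align":
        if abs(centerline_offset_m) <= align_tol:
            return "dock", "forward"
        return "align", ("rotate_left" if centerline_offset_m > 0 else "rotate_right")
    return "dock", "forward"
```

Running this each control cycle reproduces the abstract's sequence: the robot only commits to the straight final approach once the distance and alignment checks have both passed.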