G05D1/2437

Self-location estimation device, autonomous mobile body, and self-location estimation method

A self-location estimation device includes a first self-location estimation unit, a second self-location estimation unit, and a first integration unit. The first self-location estimation unit estimates a current first self-location of an autonomous mobile body based on current image information acquired by an image sensor and environmental map information stored in an environmental map information storage unit. The second self-location estimation unit estimates a current second self-location of the autonomous mobile body based on the current image information and a learned parameter learned using the environmental map information. The first integration unit estimates a current self-location of the autonomous mobile body by integrating the first self-location and the second self-location.
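
The abstract does not specify how the first and second self-locations are integrated. A common approach is inverse-variance weighted fusion of the two pose estimates; the sketch below is purely illustrative (`fuse_poses`, the `(x, y, theta)` pose layout, and the variance inputs are assumptions, not the patent's method).

```python
import numpy as np

def fuse_poses(pose_a, var_a, pose_b, var_b):
    """Inverse-variance weighted fusion of two (x, y, theta) pose estimates,
    e.g. one from map matching and one from a learned model.
    Note: averaging theta this way is only valid away from the +/-pi wrap."""
    pose_a = np.asarray(pose_a, dtype=float)
    pose_b = np.asarray(pose_b, dtype=float)
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * pose_a + w_b * pose_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # combined uncertainty shrinks
    return fused, fused_var
```

With equal variances the result is the midpoint of the two estimates, and the fused variance is half of either input's.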

AUTONOMOUS INVENTORY MANAGEMENT USING BATTERY SWAPPING DRONES
20250060748 · 2025-02-20

A computing device may generate a command for performing inventory management of a storage site based on inventory management data. The computing device may cause an inventory aerial robot to perform an inventory management trip based on the command. The inventory management trip may include receiving an input that includes coordinates of a plurality of target locations in the storage site, departing from a base station, navigating through the storage site to the plurality of target locations, capturing an image associated with an inventory item location at each of the target locations, and returning to the base station. The computing device may perform an analysis of the images captured by the inventory aerial robot. The base station may swap a battery pack of the inventory aerial robot to prepare it for another inventory management trip.
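
The trip the abstract enumerates (depart, navigate and capture per target, return, battery swap) can be expanded into an ordered action list. This is a hypothetical command planner for illustration; the actual command format in the patent is not disclosed.

```python
def plan_inventory_trip(targets):
    """Expand target coordinates into the ordered action sequence of one
    inventory management trip. Action names are illustrative."""
    actions = [("depart", None)]
    for coord in targets:
        actions.append(("navigate", coord))       # fly to the target location
        actions.append(("capture_image", coord))  # image the inventory item location
    actions.append(("return", None))              # back to the base station
    actions.append(("swap_battery", None))        # prepare for the next trip
    return actions
```

For two targets this yields seven actions, bracketed by departure and the battery swap.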

Controlling movements of a robot running on tracks

A method for controlling movement of a robot on a plurality of tracks laid out on a frame structure forming a grid includes detecting a current speed and an angular position for one wheel in a pair of wheels of the robot; tracking a current position of the robot relative to at least a portion of the frame structure; and setting a driving sequence for the pair of wheels based on position information of the robot.
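
The two pieces of the method can be sketched simply: position tracked from a wheel's cumulative angular position (odometry), and a driving sequence that picks a speed command from the remaining distance. The speed values and thresholds below are assumptions for illustration, not values from the patent.

```python
def wheel_odometry(angle_rad, wheel_radius):
    """Distance travelled along the track, inferred from one wheel's
    cumulative angular position (simple no-slip odometry)."""
    return angle_rad * wheel_radius

def driving_sequence(position, target, cruise=1.0, approach=0.25, slow_zone=0.5):
    """Speed command for the wheel pair from the robot's position on the
    track: cruise when far from the target cell, slow on approach,
    stop at the target. Thresholds are illustrative."""
    remaining = target - position
    if remaining <= 0:
        return 0.0
    return approach if remaining < slow_zone else cruise
```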

Actively modifying a field of view of an autonomous vehicle in view of constraints
12230140 · 2025-02-18

Methods and devices for actively modifying a field of view of an autonomous vehicle in view of constraints are disclosed. In one embodiment, an example method is disclosed that includes causing a sensor in an autonomous vehicle to sense information about an environment in a first field of view, where a portion of the environment is obscured in the first field of view. The example method further includes determining a desired field of view in which the portion of the environment is not obscured and, based on the desired field of view and a set of constraints for the vehicle, determining a second field of view in which the portion of the environment is less obscured than in the first field of view. The example method further includes modifying a position of the vehicle, thereby causing the sensor to sense information in the second field of view.
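
The selection step (find a second field of view that is less obscured, subject to the vehicle's constraints) amounts to constrained minimization over candidate positions. The sketch below is a hypothetical rendering of that logic; `occlusion` and the constraint predicates stand in for quantities the abstract leaves abstract.

```python
def choose_position(candidates, occlusion, constraints, current_occlusion):
    """Keep candidate vehicle positions satisfying every constraint, then
    pick the one whose field of view is least obscured. Returns None when
    no feasible candidate improves on the current field of view."""
    feasible = [p for p in candidates if all(c(p) for c in constraints)]
    best = min(feasible, key=occlusion, default=None)
    if best is None or occlusion(best) >= current_occlusion:
        return None  # stay put; no constrained improvement exists
    return best
```

Note the method only requires the second field of view to be *less* obscured than the first, not fully unobscured, which is exactly what the fallback check encodes.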

Occupancy verification device and method
12228939 · 2025-02-18

An occupancy verification device including one or more processors, configured to receive occupancy grid data representing a plurality of cells of an occupancy grid and for each cell of the plurality of cells, a probability of whether the cell is occupied; receive object location data, representing one or more locations of one or more objects identified using a model; map the object location data to one or more cells of the plurality of cells based on the one or more locations of the one or more objects; determine comparison grid data, representing the plurality of cells of the occupancy grid and for each cell of the plurality of cells, a probability of whether the cell is occupied based on whether the object location data are mapped to the cell; and compare the occupancy grid data to the comparison grid data using at least one comparing rule.
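
The pipeline can be sketched end to end: detections are mapped to cells to build a comparison grid, and each cell is then checked with a comparing rule. The probability encoding (1.0 for a cell with a mapped detection, 0.0 otherwise) and the tolerance in the default rule are assumptions for illustration.

```python
def compare_grids(occupancy, detections, rule=lambda a, b: abs(a - b) <= 0.5):
    """occupancy: dict mapping cell -> occupancy probability.
    detections: iterable of cells where model-identified objects map.
    Builds the comparison grid, then applies the comparing rule per cell,
    returning dict cell -> bool (True if the grids agree there)."""
    comparison = {cell: 0.0 for cell in occupancy}
    for cell in detections:
        comparison[cell] = 1.0  # a detection maps to this cell
    return {cell: rule(occupancy[cell], comparison[cell]) for cell in occupancy}
```

A high-probability occupied cell with no mapped detection fails the rule, which is the kind of disagreement a verification device would flag.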

Free space estimator for autonomous movement

One or more embodiments herein can enable identification of an obstacle-free area about an object. An exemplary system can comprise a memory that stores computer executable components, and a processor that executes the computer executable components stored in the memory, wherein the computer executable components can comprise an obtaining component that obtains raw data defining a physical state of an environment around an object from a vantage of the object, and a generation component that, based on the raw data, generates a dimension of at least a portion of a virtual polygon representing a boundary about the object, wherein the boundary bounds free space about the object. A sensing sub-system can comprise both an ultrasonic sensor and a camera that can separately sense the environment about the object from the vantage of the object to thereby generate separate polygon measurement sets.
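
A minimal version of the generation component: per-bearing range readings taken from the object's vantage become polygon vertices bounding the free space, whose area follows from the shoelace formula. This is a sketch under assumed inputs; the patent's system additionally fuses separate ultrasonic and camera measurement sets, which is not shown here.

```python
import math

def free_space_polygon(ranges):
    """ranges: dict mapping bearing (radians) -> measured distance from a
    sensor at the origin. Returns polygon vertices (x, y), ordered by
    bearing, bounding the obstacle-free area."""
    return [(r * math.cos(b), r * math.sin(b)) for b, r in sorted(ranges.items())]

def polygon_area(vertices):
    """Shoelace formula: area enclosed by the free-space polygon."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

Four unit-range readings at right angles produce a diamond of area 2, a quick sanity check for the construction.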

Systems and methods for navigating a vehicle among encroaching vehicles

Systems and methods use cameras to provide autonomous navigation features. In one implementation, a method for navigating a user vehicle may include acquiring, using at least one image capture device, a plurality of images of an area in a vicinity of the user vehicle; determining from the plurality of images a first lane constraint on a first side of the user vehicle and a second lane constraint on a second side of the user vehicle opposite to the first side of the user vehicle; enabling the user vehicle to pass a target vehicle if the target vehicle is determined to be in a lane different from the lane in which the user vehicle is traveling; and causing the user vehicle to abort the pass before completion of the pass, if the target vehicle is determined to be entering the lane in which the user vehicle is traveling.
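
The enable/abort logic in the claim reduces to a small decision rule on lane assignments derived from the images. The function below is an illustrative rendering with assumed lane labels, not the patent's implementation.

```python
def pass_decision(user_lane, target_lane, pass_in_progress):
    """Decide the pass maneuver state: enable a pass while the target
    vehicle occupies a different lane; abort mid-pass if the target is
    determined to be entering the user vehicle's lane."""
    if target_lane != user_lane:
        return "enable_pass"
    if pass_in_progress:
        return "abort_pass"   # target encroaching during the pass
    return "hold"             # same lane, no pass underway
```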

Systems and methods for navigating a vehicle to a default lane

Systems and methods use cameras to provide autonomous navigation features. In one implementation, a driver assist navigation system is provided for a vehicle. The system may include at least one image capture device configured to acquire a plurality of images of an area in a vicinity of the vehicle; a data interface; and at least one processing device. The at least one processing device may be configured to: receive the plurality of images via the data interface; determine from the plurality of images a current lane of travel from among a plurality of available travel lanes; and cause the vehicle to change lanes if the current lane of travel is not the same as a predetermined default travel lane.

UNMANNED AERIAL VEHICLE WITH IMMUNITY TO HIJACKING, JAMMING, AND SPOOFING ATTACKS
20250087101 · 2025-03-13

An unmanned aerial vehicle (UAV) or drone executes a neural network to assist with detecting and responding to attacks. The neural network may monitor, in real time, the data stream from a plurality of onboard sensors during navigation and may communicate with a high-altitude pseudosatellite (HAPS) platform. For example, if the neural network detects a cyber-attack but determines that it does not interfere with external communications, it may shift navigation control of the drone to the HAPS.

CAMERA-BASED COMMISSIONING AND CONTROL OF DEVICES IN A LOAD CONTROL SYSTEM

Lighting control systems may be commissioned for programming and/or control with the aid of an autonomous mobile device. Design software may be used to create a floor plan showing how the lighting control system is designed. The design software may generate floor plan identifiers for each lighting fixture, or group of lighting fixtures. During commissioning of the lighting control system, the autonomous mobile device may be used to help identify the lighting devices that have been installed in the physical space. The autonomous mobile device may receive a communication from each lighting control device that indicates a unique identifier of the lighting control device. The unique identifier may be communicated by visible light communication (VLC) or RF communication. The unique identifier may be associated with the floor plan identifier for communication of digital messages to lighting fixtures installed in the locations indicated by the floor plan identifiers.
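
The association step is essentially building a lookup from floor-plan identifiers to the unique identifiers received (via VLC or RF) at each installed location. The sketch below is hypothetical; the data structures and names are assumptions for illustration.

```python
def commission(floor_plan, received_ids):
    """floor_plan: dict mapping location -> floor plan identifier from the
    design software. received_ids: dict mapping location -> unique device
    identifier received by the autonomous mobile device at that location.
    Returns floor plan identifier -> unique identifier, so digital
    messages can later be addressed by floor-plan position."""
    mapping = {}
    for location, floor_plan_id in floor_plan.items():
        unique_id = received_ids.get(location)
        if unique_id is not None:       # device found and identified here
            mapping[floor_plan_id] = unique_id
    return mapping
```

Locations where no identifier was received are simply left unmapped, which would flag fixtures still awaiting commissioning.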