G05D1/0253

Systems and methods for vehicle position calibration using rack leg identification and mast sway compensation

A materials handling vehicle includes a camera, an odometry module, a processor, and a drive mechanism. The camera captures images of an identifier for a racking system aisle and of rack leg portions in the aisle. From the identifier, the processor derives an initial rack leg position and the rack leg spacing in the aisle, and uses the initial rack leg position to establish an initial vehicle position. It then maintains an odometry-based vehicle position from odometry data and that initial position. When a captured image reveals a subsequent rack leg, the processor correlates the detection with an expected vehicle position derived from the rack leg spacing, generates an odometry error signal from the difference between the two positions, and updates the odometry-based position using the error signal and/or a generated mast sway compensation, for use in end-of-aisle protection and/or in-aisle/out-of-aisle localization.
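
The correction loop described above can be sketched in one dimension along the aisle. This is a hypothetical illustration, not the patented implementation: the function name `correct_odometry`, its arguments, and the snap-to-nearest-leg heuristic are all assumptions.

```python
# Hypothetical sketch of the odometry-correction step: expected rack-leg
# positions are x0 + k * spacing; when a leg is detected, the difference
# between the odometry-based position and the nearest expected leg position
# becomes the error signal used to cancel accumulated odometry drift.

def correct_odometry(x_odo, x0, spacing):
    """Return (error_signal, corrected_position) for a leg detection at x_odo."""
    k = round((x_odo - x0) / spacing)   # index of the nearest expected leg
    expected = x0 + k * spacing         # expected vehicle position at that leg
    error = x_odo - expected            # accumulated odometry error
    return error, x_odo - error        # corrected position snaps to expected
```

For example, with legs every 2.5 m starting at 0.0 m, an odometry reading of 5.3 m at a leg detection implies 0.3 m of drift and a corrected position of 5.0 m.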

SELF-POSITION ESTIMATION MODEL LEARNING METHOD, SELF-POSITION ESTIMATION MODEL LEARNING DEVICE, RECORDING MEDIUM STORING SELF-POSITION ESTIMATION MODEL LEARNING PROGRAM, SELF-POSITION ESTIMATION METHOD, SELF-POSITION ESTIMATION DEVICE, RECORDING MEDIUM STORING SELF-POSITION ESTIMATION PROGRAM, AND ROBOT
20220397903 · 2022-12-15

A self-position estimation model learning device (10) includes: an acquisition unit (30) that acquires, in time series, a local image captured from the viewpoint of a self-position estimation subject in a dynamic environment, together with a bird's-eye view image that is captured from a location overlooking the subject and is synchronized with the local image; and a learning unit (32) that learns a self-position estimation model taking the local images and bird's-eye view images acquired in time series as input and outputting the position of the self-position estimation subject.
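
The data-preparation side of this idea can be sketched without any learning machinery: synchronized local and bird's-eye streams are windowed into (input sequence, position) pairs. The function `make_training_samples`, its signature, and the fixed-window scheme are illustrative assumptions, not the patent's method.

```python
def make_training_samples(local_seq, birdseye_seq, positions, window=3):
    """Slide a window over synchronized image streams to build training pairs.

    Each sample pairs the last `window` (local, bird's-eye) image pairs with
    the subject's position at the final time step, mirroring a model that
    takes both views in time series as input and outputs a position.
    """
    samples = []
    for t in range(window - 1, len(local_seq)):
        inputs = list(zip(local_seq[t - window + 1 : t + 1],
                          birdseye_seq[t - window + 1 : t + 1]))
        samples.append((inputs, positions[t]))
    return samples
```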

Camera-based commissioning

Lighting control systems may be commissioned for programming and/or control with the aid of a mobile device. Design software may be used to create a floor plan showing how the lighting control system is laid out, and may generate a floor plan identifier for each lighting fixture or group of lighting fixtures. During commissioning, the mobile device may be used to identify the lighting control devices that have been installed in the physical space. The mobile device may receive a communication from each lighting control device that indicates that device's unique identifier, conveyed by visible light communication (VLC) or RF communication. The unique identifier may then be associated with the corresponding floor plan identifier, so that digital messages can be addressed to lighting fixtures installed at the locations indicated in the floor plan.
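
At its core, commissioning here is an association step: a unique device identifier (received over VLC or RF) is bound to a floor plan identifier so messages can later be routed by location. A minimal sketch, in which the class name `CommissioningRegistry` and its methods are assumptions for illustration:

```python
class CommissioningRegistry:
    """Binds floor-plan identifiers to the unique IDs of installed devices."""

    def __init__(self):
        self._by_floor_plan = {}

    def associate(self, floor_plan_id, unique_id):
        # Called during commissioning, once the mobile device has matched
        # a received unique identifier to a location on the floor plan.
        self._by_floor_plan[floor_plan_id] = unique_id

    def resolve(self, floor_plan_id):
        # Later, digital messages addressed by floor-plan location are
        # routed to the device's unique identifier.
        return self._by_floor_plan[floor_plan_id]
```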

Image-based velocity control for a turning vehicle

An autonomous vehicle control system is provided. The control system may include a plurality of cameras to acquire a plurality of images of an area in the vicinity of a vehicle; and at least one processing device configured to: recognize a curve to be navigated based on map data and vehicle position information; determine an initial target velocity for the vehicle based on at least one characteristic of the curve as reflected in the map data; adjust a velocity of the vehicle to the initial target velocity; determine, based on the plurality of images, observed characteristics of the curve; determine an updated target velocity based on the observed characteristics of the curve; and adjust the velocity of the vehicle to the updated target velocity.
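
One common way to turn a curve characteristic into a target velocity is the standard relation v = sqrt(a_lat · R) between speed, lateral acceleration, and curve radius; the same function can then be re-evaluated with the radius observed from the images. The abstract does not specify this formula, so the sketch below, including the function name and limit values, is an assumption:

```python
import math

def target_speed(radius_m, a_lat_max=2.0, v_max=30.0):
    """Highest speed (m/s) keeping lateral acceleration within a_lat_max on a curve.

    Called first with the map-derived radius for the initial target velocity,
    then again with the image-observed radius for the updated target velocity.
    """
    return min(v_max, math.sqrt(a_lat_max * radius_m))
```

For a 50 m map radius this gives 10 m/s; if the cameras then reveal a tighter 30 m radius, the updated target drops to about 7.7 m/s.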

Vehicle trajectory prediction using road topology and traffic participant object states
11591012 · 2023-02-28

System, method, and device for controlling a vehicle. In one example, the system includes an electronic processor configured to capture, via a camera, a first image, determine, within the first image, a road traffic factor, and generate, based on sensor information from one or more sensors of the vehicle, a second image depicting the environment surrounding the vehicle. The second image includes the road traffic factor. The electronic processor is also configured to determine, based on the detected road traffic factor and the second image, a predicted trajectory of a traffic participant proximate to the vehicle, and to generate a steering command for the vehicle based on the predicted trajectory.
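
The abstract leaves the prediction model unspecified; a common baseline stand-in is a constant-velocity rollout of the participant's current state. The sketch below is that baseline, not the patented predictor, and the function name is an assumption:

```python
def predict_trajectory(pos, vel, horizon=3, dt=1.0):
    """Constant-velocity trajectory rollout for a traffic participant.

    pos: (x, y) current position; vel: (vx, vy) current velocity.
    Returns the predicted positions at the next `horizon` time steps.
    """
    x, y = pos
    vx, vy = vel
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, horizon + 1)]
```

A steering command could then be derived by checking the predicted positions against the host vehicle's planned path.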

Method, system and apparatus for localization-based historical obstacle handling

A method of obstacle handling for a mobile automation apparatus includes: obtaining an initial localization of the mobile automation apparatus in a frame of reference; detecting an obstacle by one or more sensors disposed on the mobile automation apparatus; generating and storing an initial position of the obstacle in the frame of reference, based on (i) the initial localization and (ii) a detected position of the obstacle relative to the mobile automation apparatus; obtaining a correction to the initial localization of the mobile automation apparatus; and applying a positional adjustment, based on the correction, to the initial position of the obstacle to generate and store an updated position of the obstacle.
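
If the localization correction is modelled as a 2D rigid-body transform (dx, dy, dtheta), the same transform can be applied to every obstacle position recorded under the old localization. This is a minimal sketch under that assumption; the function name and parameterization are illustrative:

```python
import math

def apply_correction(obstacle, dx, dy, dtheta):
    """Re-map a stored obstacle position through a localization correction.

    (dx, dy, dtheta) is the rigid-body correction to the apparatus pose;
    applying it to positions recorded under the erroneous localization
    yields their updated positions in the corrected frame of reference.
    """
    x, y = obstacle
    c, s = math.cos(dtheta), math.sin(dtheta)
    return (c * x - s * y + dx, s * x + c * y + dy)
```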

SENSOR DATA SEGMENTATION
20230055888 · 2023-02-23

A system may include one or more processors configured to receive a plurality of images representing an environment, the images including image data generated by an image capture device. The processors may also be configured to transmit the image data to an image segmentation network configured to segment the images, and to receive sensor data associated with the environment, including sensor data generated by a sensor of a type different from an image capture device. The processors may be configured to associate the sensor data with the segmented images to create a training dataset, to transmit the training dataset to a machine learning network configured to run a sensor data segmentation model, and to train the sensor data segmentation model using the training dataset, such that the model is configured to segment sensor data.
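
The association step, pairing sensor samples with segmented images so the image labels can supervise the sensor-data model, can be sketched as a nearest-timestamp match. The abstract does not say how the association is done, so this pairing rule and the function name are assumptions:

```python
def build_training_set(sensor_samples, segmented_images):
    """Pair each sensor sample with the segmented image nearest in time.

    sensor_samples:   list of (timestamp, sample) from the non-camera sensor.
    segmented_images: list of (timestamp, segmentation labels).
    Returns (sample, labels) pairs usable as a training dataset.
    """
    dataset = []
    for t, sample in sensor_samples:
        _, labels = min(segmented_images, key=lambda item: abs(item[0] - t))
        dataset.append((sample, labels))
    return dataset
```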

TEMPORARY RULE SUSPENSION FOR AUTONOMOUS NAVIGATION
20220363248 · 2022-11-17

A navigation system for a host vehicle is provided. The system may comprise at least one processing device comprising circuitry and a memory. The memory includes instructions that, when executed by the circuitry, cause the at least one processing device to: receive a plurality of images acquired by a camera, the plurality of images being representative of an environment of the host vehicle; analyze the plurality of images to identify the presence, in the environment of the host vehicle, of a navigation rule suspension condition; temporarily suspend at least one navigational rule in response to identification of the navigation rule suspension condition; and cause at least one navigational change of the host vehicle unconstrained by the temporarily suspended navigational rule.
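
The suspend-then-act pattern can be sketched as a small rule engine in which each rule is a predicate over candidate actions, and a detected suspension condition removes a rule from the active set. The class and rule names below are illustrative assumptions:

```python
class RuleEngine:
    """Navigational rules as predicates; suspended rules are not enforced."""

    def __init__(self, rules):
        self.rules = dict(rules)      # rule name -> predicate(action) -> bool
        self.suspended = set()

    def suspend(self, name):
        # e.g. invoked when image analysis detects a rule suspension
        # condition such as a closed lane or a traffic officer's signal.
        self.suspended.add(name)

    def restore(self, name):
        self.suspended.discard(name)

    def allowed(self, action):
        return all(check(action) for name, check in self.rules.items()
                   if name not in self.suspended)
```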

Vehicle sensor fusion
11585920 · 2023-02-21

Various systems and methods for optimizing use of environmental and operational sensors are described herein. A system for improving sensor efficiency includes object recognition circuitry implementable in a vehicle to detect an object ahead of the vehicle. The object recognition circuitry is configured to use an object detection operation to detect the object from sensor data of a sensor array, and to use at least one object tracking operation to track the object between successive object detection operations. A processor subsystem calculates a relative velocity of the object with respect to the vehicle and configures the object recognition circuitry to adjust the intervals between successive object detection operations based on that relative velocity.
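
The interval-adjustment policy can be sketched as an inverse relationship between relative speed and the gap between full detection passes, with cheaper tracking filling the gaps. The gain and clamp values below are assumptions for illustration, not values from the patent:

```python
def detection_interval(rel_speed_mps, gain=10.0,
                       min_interval_s=0.1, max_interval_s=2.0):
    """Choose the gap (s) between full object detection passes.

    Faster-closing objects trigger more frequent detections; slow or
    static objects are left mostly to the cheaper tracking operations.
    """
    if abs(rel_speed_mps) < 1e-6:
        return max_interval_s          # object effectively static
    return max(min_interval_s,
               min(max_interval_s, gain / abs(rel_speed_mps)))
```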
