G05D1/247

Continuous convolution and fusion in neural networks

Systems and methods are provided for machine-learned models including convolutional neural networks that generate predictions using continuous convolution techniques. For example, the systems and methods of the present disclosure can be included in or otherwise leveraged by an autonomous vehicle. In one example, a computing system can perform, with a machine-learned convolutional neural network, one or more convolutions over input data using a continuous filter relative to a support domain associated with the input data, and receive a prediction from the machine-learned convolutional neural network. A machine-learned convolutional neural network in some examples includes at least one continuous convolution layer configured to perform convolutions over input data with a parametric continuous kernel.
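The parametric continuous kernel described above can be illustrated with a minimal sketch. All names here are hypothetical, and a tiny random-weight MLP stands in for the learned continuous filter: each support point's neighbours are weighted by the filter evaluated at their continuous offsets, rather than by a fixed discrete grid of weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parametric kernel: a tiny MLP g(offset) -> weight,
# standing in for the learned continuous filter.
W1 = rng.normal(size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def kernel(offsets):
    """Evaluate the continuous filter at arbitrary 3-D offsets."""
    h = np.maximum(offsets @ W1 + b1, 0.0)    # ReLU hidden layer
    return (h @ W2 + b2).squeeze(-1)          # one weight per offset

def continuous_conv(points, feats, k=4):
    """For each support point, weight the features of its k nearest
    neighbours by the kernel evaluated at their relative offsets."""
    out = np.empty(len(points))
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        idx = np.argsort(d)[:k]               # k nearest support points
        w = kernel(points[idx] - p)           # continuous kernel weights
        out[i] = np.dot(w, feats[idx])
    return out

points = rng.uniform(size=(32, 3))            # unstructured support domain
feats = rng.normal(size=32)
out = continuous_conv(points, feats)
print(out.shape)  # (32,)
```

Because the kernel is a function of the offset rather than a weight table, the support domain can be any unstructured point set, such as a lidar point cloud.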

Predictive vehicle acquisition

A computer is programmed to instruct a vehicle to navigate to a pickup location based on a current user location received from a first device and data about a user event received from a second device.

Calibration of vehicle sensor array alignment
11878632 · 2024-01-23

In one embodiment, a method includes receiving a first set of positional parameters for movement of a vehicle through an environment, captured by a first sensor associated with the vehicle. The method includes receiving a second set of positional parameters for movement of the vehicle through the environment, captured by a second sensor associated with the vehicle. The method includes calculating an angular offset between the first set of positional parameters and the second set of positional parameters based on comparing the two sets. The method includes determining a calibration factor based on the calculated angular offset. The method includes calibrating at least one of the first sensor or the second sensor using the calibration factor.
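A minimal sketch of the angular-offset step, under assumptions not stated in the abstract (positional parameters taken as 2-D position tracks, and the offset taken as the mean heading difference between the two sensors' estimates of the same motion):

```python
import numpy as np

def heading(track):
    """Headings (rad) of consecutive displacement vectors."""
    d = np.diff(track, axis=0)
    return np.arctan2(d[:, 1], d[:, 0])

def angular_offset(track_a, track_b):
    """Mean wrapped angular difference between two sensors'
    estimates of the same vehicle motion."""
    diff = heading(track_a) - heading(track_b)
    diff = (diff + np.pi) % (2 * np.pi) - np.pi   # wrap to [-pi, pi)
    return diff.mean()

# Sensor B reports the same path rotated by 0.1 rad (a misalignment).
t = np.linspace(0, 1, 50)
track_a = np.stack([t, t**2], axis=1)
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
track_b = track_a @ R.T

offset = angular_offset(track_a, track_b)   # ~ -0.1 rad
print(round(offset, 3))
```

The calibration factor would then be a rotation that cancels this offset in one sensor's frame.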

Reliable visual markers based on multispectral characteristics
11881033 · 2024-01-23

Disclosed herein are methods and systems for painting driving assistance markings using one or more paint materials that are visible in a plurality of light spectral ranges, in particular visible light and one or more infrared light spectral ranges. Further disclosed are methods and systems for analyzing images captured in multiple spectral ranges to identify the driving assistance markings, and/or parts thereof, in a plurality of different spectral ranges, and to identify aggregated driving assistance markings by aggregating the markings identified in the plurality of different spectral ranges. Also disclosed herein are methods and systems for presenting and detecting enhanced driving assistance markings on one or more elements under one or more paint materials that are highly transparent in one or more infrared spectral ranges while reflecting visible light conforming to the color of the element(s) surface, thus not affecting the appearance of the element(s) in the visible light spectrum.
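The aggregation step can be sketched as a vote over per-spectrum detections. The masks and the voting rule below are hypothetical illustrations, not the disclosed detectors:

```python
import numpy as np

# Binary detection masks of markings in three spectral ranges
# (hypothetical detector outputs; 1 = marking pixel).
visible = np.array([[0, 1, 1, 0], [0, 0, 0, 0]])
nir     = np.array([[0, 1, 0, 0], [0, 1, 0, 0]])
swir    = np.array([[0, 0, 1, 1], [0, 0, 0, 0]])

def aggregate(masks, min_votes=1):
    """A pixel belongs to the aggregated marking if at least
    `min_votes` spectral ranges detected it."""
    votes = np.sum(masks, axis=0)
    return (votes >= min_votes).astype(int)

union = aggregate([visible, nir, swir], min_votes=1)      # any spectrum
consensus = aggregate([visible, nir, swir], min_votes=2)  # agreement of two
print(union.sum(), consensus.sum())  # 4 2
```

A union recovers markings visible in only one range (e.g. IR-only paint), while a higher vote threshold trades recall for robustness to per-spectrum false positives.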

Method of automatically combining farm vehicle and work machine and farm vehicle

Provided is a method of automatically combining a farm vehicle with a work machine, including confirming a current position of the work machine, moving the farm vehicle into a range having a predetermined radius around the current position, and controlling the farm vehicle, on the basis of a current position and direction of a first coupling unit included in the work machine, so that the first coupling unit and a second coupling unit included in the farm vehicle are coupled to each other.
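The two phases above (approach to within a radius, then align with the coupling unit's direction) can be sketched as follows; the function name, the 2 m radius, and the head-on alignment rule are assumptions for illustration:

```python
import math

def approach_and_align(tractor, hitch, hitch_heading, radius=2.0):
    """Two-phase coupling: (1) drive the tractor to within `radius`
    of the work machine's hitch, (2) report the heading the tractor's
    coupler must take to meet the hitch head-on."""
    dx, dy = hitch[0] - tractor[0], hitch[1] - tractor[1]
    dist = math.hypot(dx, dy)
    if dist > radius:
        # Phase 1: target point on the radius circle around the hitch
        f = (dist - radius) / dist
        return ("approach", (tractor[0] + f * dx, tractor[1] + f * dy))
    # Phase 2: align opposite to the hitch's facing direction
    return ("align", (hitch_heading + math.pi) % (2 * math.pi))

phase, target = approach_and_align((10.0, 0.0), (0.0, 0.0), 0.0)
print(phase, target)        # approach (2.0, 0.0)
phase2, head = approach_and_align((1.0, 0.0), (0.0, 0.0), 0.0)
print(phase2, round(head, 3))
```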

Methods and apparatus for using scene-based metrics to gate readiness of autonomous systems
11886193 · 2024-01-30

According to one aspect, a method is provided to determine, using scene-based metrics (metrics based on instances of scenarios), whether an autonomous system is ready to be deployed or is otherwise ready for use. Scene-based metrics are mapped, or otherwise translated, to distance-based metrics such that substantially standard distance-based metrics may be used to gate the readiness of an autonomy system for deployment.
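One plausible form of the scene-to-distance mapping is to weight each scenario's per-instance failure rate by how often that scenario occurs per mile. The scenario names, rates, and threshold below are invented for illustration only:

```python
# Hypothetical mapping from scene-based metrics (failure rate per
# scenario instance) to a distance-based metric (failures per mile),
# using assumed per-mile scenario frequencies.
scenario_freq_per_mile = {       # occurrences of each scenario per mile
    "unprotected_left": 0.5,
    "pedestrian_crossing": 1.2,
    "cut_in": 2.0,
}
failures_per_instance = {        # scene-based metric, e.g. from simulation
    "unprotected_left": 0.001,
    "pedestrian_crossing": 0.0002,
    "cut_in": 0.0001,
}

def failures_per_mile(freqs, rates):
    """Translate scene-based failure rates into a distance-based rate."""
    return sum(freqs[s] * rates[s] for s in freqs)

rate = failures_per_mile(scenario_freq_per_mile, failures_per_instance)
ready = rate < 0.001             # gate against a distance-based threshold
print(round(rate, 6), ready)
```

This lets scenario-level test results be compared against the per-mile readiness thresholds conventionally used for deployment gating.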

Operator assistance for autonomous vehicles

Disclosed are autonomous vehicles that may autonomously navigate at least a portion of a route defined by a service request allocator. The autonomous vehicle may, at a certain portion of the route, request remote assistance. In response to the request, an operator may provide input to a console that indicates control positions for one or more vehicle controls such as steering position, brake position, and/or accelerator position. A command is sent to the autonomous vehicle indicating how the vehicle should proceed along the route. When the vehicle reaches a location where remote assistance is no longer required, the autonomous vehicle is released from manual control and may then continue executing the route under autonomous control.
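The handoff described above can be sketched as a small state machine; the class names, the normalized control ranges, and the command schema are hypothetical, not the disclosed protocol:

```python
from dataclasses import dataclass

@dataclass
class ControlCommand:
    """Operator console output sent to the vehicle (assumed schema):
    normalized positions for the controls named in the abstract."""
    steering: float      # [-1, 1], left negative
    brake: float         # [0, 1]
    accelerator: float   # [0, 1]

class AssistanceSession:
    """Manual control lasts from the assistance request until the
    vehicle is released, after which it resumes autonomous execution."""
    def __init__(self):
        self.manual = False
    def request_assistance(self):
        self.manual = True
    def command(self, cmd: ControlCommand):
        if not self.manual:
            raise RuntimeError("vehicle is under autonomous control")
        return {"steering": cmd.steering, "brake": cmd.brake,
                "accelerator": cmd.accelerator}
    def release(self):
        self.manual = False   # vehicle continues the route autonomously

s = AssistanceSession()
s.request_assistance()
print(s.command(ControlCommand(0.2, 0.0, 0.3)))
s.release()
print(s.manual)  # False
```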

Multi-scale driving environment prediction with hierarchical spatial temporal attention

In accordance with one embodiment of the present disclosure, a method includes obtaining multi-level environment data corresponding to a plurality of driving environment levels, encoding the multi-level environment data at each level, extracting features from the multi-level environment data at each encoded level, fusing the extracted features from each encoded level with a spatial-temporal attention framework to generate a fused information embedding, and decoding the fused information embedding to predict driving environment information at one or more driving environment levels.
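A highly simplified stand-in for the encode-then-fuse steps: random linear encoders project each level's data to a shared dimension, and softmax attention over levels produces the fused embedding. Everything here (dimensions, scoring rule) is assumed, not the disclosed hierarchical spatial-temporal attention:

```python
import numpy as np

rng = np.random.default_rng(1)

def encode(level_data):
    """Stand-in per-level encoder: project raw features to d = 16."""
    W = rng.normal(size=(level_data.shape[-1], 16))
    return level_data @ W

def attention_fuse(level_feats):
    """Fuse per-level embeddings with softmax attention over levels."""
    stacked = np.stack(level_feats)               # (levels, d)
    scores = stacked.mean(axis=1)                 # one score per level
    w = np.exp(scores - scores.max())
    w /= w.sum()                                  # softmax weights
    return (w[:, None] * stacked).sum(axis=0)     # fused embedding

# Three driving-environment levels, e.g. lane / road / city scale,
# with different raw feature sizes per level.
levels = [rng.normal(size=(8,)), rng.normal(size=(12,)), rng.normal(size=(4,))]
fused = attention_fuse([encode(x) for x in levels])
print(fused.shape)  # (16,)
```

A decoder head would then map the fused embedding back to predictions at each driving environment level.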

System for monitoring stability of operation of autonomous robots

System for monitoring stability of an autonomous robot, including a GNSS navigation receiver with an antenna, an analog front end, a plurality of channels, an inertial measurement unit (IMU), and a processor, all generating navigation and orientation data for the robot; based on the navigation and orientation data, calculating the position and direction of movement of the robot; calculating spatial and orientation coordinates z₁, z₂ of the robot relating to the position and direction of movement; continuing with the programmed path for the robot for any coordinates z₁, z₂ within an attraction domain, where a measure V(z) of distance from zero in the z₁, z₂ plane, defined by Lurie-Postnikov functions, is less than 1; for coordinates outside the attraction domain with V(z) > 1, terminating the programmed path and generating a notification.
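The monitoring rule can be sketched with a toy Lurie-Postnikov-type function. The matrix P, the tanh nonlinearity, and the coordinate meanings are assumptions for illustration; only the V(z) < 1 attraction-domain test comes from the abstract:

```python
import numpy as np

P = np.array([[0.5, 0.1],
              [0.1, 0.4]])       # assumed positive-definite quadratic part

def V(z):
    """Lurie-Postnikov-type measure: a quadratic form plus the integral
    of a sector nonlinearity phi(s) = tanh(s) of the coordinate z2."""
    quad = z @ P @ z
    integral = np.log(np.cosh(z[1]))   # integral of tanh from 0 to z2
    return quad + integral

def monitor(z):
    """Continue the programmed path inside the attraction domain
    V(z) < 1; terminate and notify otherwise."""
    if V(z) < 1.0:
        return "continue"
    return "terminate-and-notify"

print(monitor(np.array([0.2, 0.1])))   # well inside the domain
print(monitor(np.array([2.0, 1.5])))   # outside: V(z) > 1
```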
