G05D1/0088

Hyper planning based on object and/or region

A vehicle computing system may implement techniques to predict behavior of objects detected by a vehicle operating in an environment. The techniques may include determining a feature with respect to a detected object (e.g., a likelihood that the detected object will impact operation of the vehicle) and/or a location of the vehicle, and determining, based on the feature, a model to use to predict behavior (e.g., estimated states) of proximate objects (e.g., the detected object). The model may be configured to use one or more algorithms, classifiers, and/or computational resources to predict the behavior. Different models may be used to predict behavior of different objects and/or regions in the environment. Each model may receive sensor data as an input and output predicted behavior for the detected object. Based on the predicted behavior of the object, the vehicle computing system may control operation of the vehicle.
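The per-object model selection described above can be sketched as a simple dispatch on the computed feature. The feature name, thresholds, and model labels below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    object_id: int
    # Hypothetical feature: likelihood the object will impact vehicle operation.
    impact_likelihood: float

def select_prediction_model(obj: DetectedObject) -> str:
    """Pick a behavior-prediction model for an object based on its feature.

    High-impact objects get a more expensive model; low-impact objects get a
    cheap kinematic extrapolation. Thresholds here are illustrative.
    """
    if obj.impact_likelihood >= 0.7:
        return "learned-interaction-model"  # more algorithms / compute
    if obj.impact_likelihood >= 0.3:
        return "maneuver-classifier"
    return "constant-velocity"              # minimal compute

# Usage: different models assigned to different detected objects.
models = {o.object_id: select_prediction_model(o)
          for o in (DetectedObject(1, 0.9), DetectedObject(2, 0.1))}
```

The point of the dispatch is that compute budget scales with relevance: distant or low-impact objects do not need the expensive model.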

System and method for automated flight correction in an electric aircraft
11577822 · 2023-02-14

A system for automated flight correction in an electric aircraft includes: an input control configured to receive a user input and generate a control datum as a function of the user input; at least a sensor connected to the electric aircraft and configured to detect a status datum and transmit the status datum to a flight controller; a flight controller communicatively connected to the input control and the at least a sensor, and configured to receive the control datum from the input control, receive the status datum from the at least a sensor, and determine a command datum as a function of the control datum and the status datum; and an actuator connected to the flight controller and configured to receive the command datum from the flight controller and command at least a flight component as a function of the command datum.
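The "command datum as a function of the control datum and status datum" step can be sketched as a minimal proportional correction. The blending arithmetic and gain value are assumptions for illustration; the patent does not specify the function:

```python
def determine_command_datum(control_datum: float, status_datum: float,
                            gain: float = 0.5) -> float:
    """Blend pilot intent (control datum) with sensed state (status datum).

    A minimal proportional scheme: the command is the pilot's input plus a
    gain-scaled correction for the deviation the sensor reports.
    """
    error = control_datum - status_datum
    return control_datum + gain * error

# Usage: pilot commands 10 deg of pitch, sensor reports 8 deg achieved;
# the flight controller over-commands slightly to close the gap.
command = determine_command_datum(10.0, 8.0)
```

In a real flight controller this would be a full closed loop (e.g., PID with rate limits), but the sketch shows the data flow: input control and sensor feed the controller, which emits one command datum to the actuator.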

Method and apparatus for de-biasing the detection and labeling of objects of interest in an environment
11579625 · 2023-02-14

Described herein are methods of generating learning data to facilitate de-biasing the labeled location of an object of interest within an image. Methods may include: receiving sensor data, where the sensor data is a first image; determining reference corner locations of an object in the first image using image processing; generating observed corner locations of the object in the first image from the determined reference corner locations; generating a bias transformation based, at least in part, on a difference between the reference corner locations and the observed corner locations of the object in the first image; receiving sensor data from another image sensor of a second image; receiving observed corner locations of an object in the second image from a user; and applying the bias transformation to the observed corner locations of the object in the second image to generate de-biased corners for the object in the second image.
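The bias transformation in the steps above can be sketched as the mean offset between reference and observed corners, later applied to corners labeled by a user in another image. Representing the transformation as a pure translation is a simplifying assumption; the patent leaves its form open:

```python
import numpy as np

def compute_bias(reference: np.ndarray, observed: np.ndarray) -> np.ndarray:
    """Bias transformation: mean offset (dx, dy) between the reference corner
    locations and the observed corner locations, each an (N, 2) array."""
    return (reference - observed).mean(axis=0)

def debias(observed: np.ndarray, bias: np.ndarray) -> np.ndarray:
    """Apply the bias transformation to observed corners from another image."""
    return observed + bias

# Usage: observed corners sit 2 px left of the reference corners, so the
# learned bias shifts user-labeled corners 2 px right.
ref = np.array([[10., 10.], [20., 10.], [20., 20.], [10., 20.]])
obs = ref - np.array([2., 0.])
bias = compute_bias(ref, obs)
corrected = debias(obs, bias)
```

A richer version would fit a full affine transformation per corner, but the translation case already shows how a systematic labeling bias measured in one image can de-bias labels in another.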

Systems and methods for delivering products via autonomous ground vehicles to restricted areas designated by customers

In some embodiments, methods and systems are provided for facilitating delivery, via autonomous ground vehicles, of products ordered by customers of a retailer to customer-specified restricted areas accessible through an entryway openable via an access code.

Identifying a route for an autonomous vehicle between an origin and destination location

Described herein are technologies relating to computing a likelihood of an operation-influencing event with respect to an autonomous vehicle at a geographic location. The likelihood of the operation-influencing event is computed based upon a prediction of a value that indicates whether, through a causal process, the operation-influencing event is expected to occur. The causal process is identified by means of a model that relates spatiotemporal factors to operation-influencing events.
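One minimal way to sketch a model relating spatiotemporal factors to an event likelihood is a logistic function over location features. The feature names and weights below are hypothetical placeholders, not from the patent:

```python
import math

def event_likelihood(features: dict, weights: dict, bias: float = -3.0) -> float:
    """Probability of an operation-influencing event at a location, as a
    logistic function of spatiotemporal factors (illustrative sketch)."""
    z = bias + sum(weights.get(name, 0.0) * value
                   for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Usage: a location with high pedestrian density at rush hour scores a
# higher event likelihood than a quiet one.
busy = event_likelihood({"pedestrian_density": 0.9, "is_rush_hour": 1.0},
                        {"pedestrian_density": 2.5, "is_rush_hour": 1.0})
quiet = event_likelihood({"pedestrian_density": 0.1, "is_rush_hour": 0.0},
                         {"pedestrian_density": 2.5, "is_rush_hour": 1.0})
```

Such per-location likelihoods could then weight candidate routes between an origin and destination.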

System and method for removing debris from a storage facility

Autonomous carriers or totes that include vacuum units are provided. As the totes move or are moved through a warehouse carrying products, they collect debris. The debris can be analyzed at the tote, and actions can be performed based upon the analysis.

Temporal information prediction in autonomous machine applications

In various examples, a sequential deep neural network (DNN) may be trained using ground truth data generated by correlating (e.g., by cross-sensor fusion) sensor data with image data representative of a sequence of images. In deployment, the sequential DNN may leverage the sensor correlation to compute various predictions using image data alone. The predictions may include velocities, in world space, of objects in fields of view of an ego-vehicle, current and future locations of the objects in image space, and/or a time-to-collision (TTC) between the objects and the ego-vehicle. These predictions may be used as part of a perception system for understanding and reacting to a current physical environment of the ego-vehicle.
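Of the quantities the DNN predicts, TTC has a simple closed form under a constant-closing-speed assumption, which is a common baseline (the patent's DNN predicts it directly from images):

```python
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Time-to-collision between the ego-vehicle and an object.

    Constant-closing-speed approximation: TTC = range / closing speed.
    Returns infinity when the object is not closing (speed <= 0).
    """
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

# Usage: object 30 m ahead, closing at 10 m/s.
ttc = time_to_collision(30.0, 10.0)
```

The appeal of the learned approach in the abstract is that, after training against fused sensor ground truth, range and closing speed need not be measured at inference time: TTC comes from the image sequence alone.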

Autonomous vehicle park-and-go scenario design

In one embodiment, when an autonomous driving vehicle (ADV) is parked, the ADV can determine, based on criteria, whether to operate in an open-space mode or an on-lane mode. The criteria can include whether the ADV is within a threshold distance and threshold heading relative to a vehicle lane. If the criteria are not satisfied, the ADV can enter the open-space mode. While in the open-space mode, the ADV can maneuver until it is within the threshold distance and the threshold heading relative to the vehicle lane. In response to the criteria being satisfied, the ADV can enter and operate in the on-lane mode to resume driving along the vehicle lane.
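The mode decision above reduces to a two-threshold check. The specific threshold values below are illustrative assumptions; the patent only specifies that thresholds exist:

```python
def select_driving_mode(dist_to_lane_m: float, heading_error_deg: float,
                        dist_thresh_m: float = 2.0,
                        heading_thresh_deg: float = 15.0) -> str:
    """Return 'on-lane' when the ADV is within both the distance and heading
    thresholds relative to the vehicle lane, otherwise 'open-space'."""
    if (dist_to_lane_m <= dist_thresh_m
            and abs(heading_error_deg) <= heading_thresh_deg):
        return "on-lane"
    return "open-space"

# Usage: a parked ADV far from the lane maneuvers in open-space mode,
# re-checking the criteria until on-lane mode is available.
mode = select_driving_mode(dist_to_lane_m=5.0, heading_error_deg=40.0)
```

Open-space mode would run a free-space planner (no lane constraints) to close both gaps, after which the same check flips the ADV into on-lane mode.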

Managing redundant steering system for autonomous vehicles
11577776 · 2023-02-14

Techniques are described for managing a redundant steering system for a vehicle. A method includes: sending a first control command that instructs a first motor, coupled to a steering wheel in a steering system, to steer a vehicle; receiving, after sending the first control command, a speed of the vehicle, a yaw rate of the vehicle, and a steering position of the steering wheel; determining, based at least on the speed and the yaw rate, an expected range of steering angles within which the first motor is expected to steer the vehicle based on the first control command; and, upon determining that the steering position of the steering wheel is outside the expected range of steering angles, sending a second control command that instructs a second motor coupled to the steering wheel in the steering system to steer the vehicle.
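The expected steering-angle range implied by speed and yaw rate can be sketched with a kinematic bicycle model (yaw_rate = (v / L) · tan(delta)), which is an assumption here; the patent does not name the vehicle model, and the wheelbase and tolerance values are illustrative:

```python
import math

def expected_steering_range(speed_mps: float, yaw_rate_rps: float,
                            wheelbase_m: float = 2.8,
                            tolerance_rad: float = 0.05) -> tuple:
    """Expected steering-angle interval (radians) consistent with the
    measured speed and yaw rate, from a kinematic bicycle model:
    yaw_rate = (v / L) * tan(delta)  =>  delta = atan(L * yaw_rate / v).
    """
    delta = math.atan2(wheelbase_m * yaw_rate_rps, speed_mps)
    return (delta - tolerance_rad, delta + tolerance_rad)

def needs_failover(steering_position_rad: float, expected: tuple) -> bool:
    """True when the measured steering position falls outside the expected
    range, i.e., the first motor is not steering as commanded and the
    redundant second motor should be commanded."""
    lo, hi = expected
    return not (lo <= steering_position_rad <= hi)

# Usage: at 10 m/s with a 0.5 rad/s yaw rate, the nominal steering angle
# is about 0.14 rad; a reading of 0.5 rad would trigger the second motor.
expected = expected_steering_range(10.0, 0.5)
```

The failover check is a plausibility test: it does not need a fault signal from the first motor, only that the observed steering position disagrees with what the vehicle's motion implies.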

Parking control system for autonomous vehicle

A parking control system for an autonomous vehicle is provided. The parking control system includes a parking control device configured to monitor a location and movement of an autonomous vehicle entering a parking lot based on a 3D electronic map, calculate a driving trajectory to a parking space selected by a driver of the autonomous vehicle based on sensor data collected from various sensors in the parking lot and vehicle information received from the autonomous vehicle, and provide information about the calculated driving trajectory to the autonomous vehicle. The autonomous vehicle travels to the parking space based on the driving trajectory received from the parking control device and parks in the parking space.