G05D1/00

L3-level auto-emergency light system for ego vehicle harsh brake
11577644 · 2023-02-14 · ·

In one embodiment, a method, apparatus, and system for automatically switching on an emergency light of an autonomous driving vehicle (ADV) are disclosed. A present speed of the ADV at a first time instant is determined. A present deceleration of the ADV at the first time instant is determined. Whether the present speed satisfies a present speed condition and whether the present deceleration satisfies a present deceleration condition at the first time instant are determined. In response to determining that the present speed satisfies the present speed condition and that the present deceleration satisfies the present deceleration condition, whether a recent deceleration history of the ADV satisfies a recent deceleration history condition and whether an expected deceleration of the ADV satisfies an expected deceleration condition are determined. If either condition is satisfied, an emergency light of the ADV is automatically switched on.
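The gating logic described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the thresholds (`speed_min`, `decel_threshold`) and the exact form of each condition are assumptions.

```python
def should_switch_on_emergency_light(speed_mps, decel_mps2, recent_decels,
                                     expected_decel,
                                     speed_min=5.0, decel_threshold=4.0):
    """Decide whether to switch on the emergency light for a harsh brake.

    Both present-time conditions must hold; then satisfying EITHER the
    recent-deceleration-history condition OR the expected-deceleration
    condition triggers the light. Threshold values are hypothetical.
    """
    # Present speed and present deceleration conditions (both must hold).
    if speed_mps < speed_min or decel_mps2 < decel_threshold:
        return False
    # Recent history condition: deceleration sustained over recent samples.
    history_ok = bool(recent_decels) and min(recent_decels) >= decel_threshold
    # Expected (planned) deceleration condition.
    expected_ok = expected_decel >= decel_threshold
    return history_ok or expected_ok
```

At a glance, this encodes the abstract's two-stage check: the present-time conditions act as a conjunctive gate, and the history/expected conditions as a disjunctive trigger.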

System and method for assisting collaborative sensor calibration
11579632 · 2023-02-14 · ·

Embodiments described herein include a method of receiving, by a moving assisting vehicle, a calibration assistance request related to a moving ego vehicle that has requested assistance in collaborative calibration of a sensor deployed on the moving ego vehicle. The method further includes analyzing the calibration assistance request to extract at least one of a schedule or an assistance route associated with the requested assistance. The method includes communicating with the moving ego vehicle about a desired location, relative to the position of the moving ego vehicle, that the moving assisting vehicle should occupy so that the sensor can acquire information about a target present on the moving assisting vehicle. The method includes facilitating driving of the moving assisting vehicle to the desired location to achieve the collaborative calibration of the sensor on the moving ego vehicle.
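Two of the steps above can be sketched in a few lines. The request field names (`schedule`, `assistance_route`) and the relative-offset representation are illustrative assumptions, not taken from the patent.

```python
def parse_assistance_request(request):
    """Extract at least one of a schedule or an assistance route from a
    collaborative-calibration assistance request (field names assumed)."""
    schedule = request.get("schedule")
    route = request.get("assistance_route")
    if schedule is None and route is None:
        raise ValueError("request carries neither a schedule nor a route")
    return schedule, route

def desired_assisting_position(ego_position, offset):
    """Desired location for the assisting vehicle, expressed as an offset
    relative to the ego vehicle, so the ego sensor can see the target."""
    return tuple(p + o for p, o in zip(ego_position, offset))
```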

Systems and methods for utilizing images to determine the position and orientation of a vehicle

Described are systems and methods to utilize images to determine the position and/or orientation of a vehicle (e.g., an autonomous ground vehicle) operating in an unstructured environment (e.g., environments such as sidewalks, which typically lack lane markings, road markings, etc.). The described systems and methods can determine the vehicle's position and orientation based on an alignment of annotated images captured during operation of the vehicle with a known annotated reference map. The translation and rotation applied to obtain alignment of the annotated images with the known annotated reference map can provide the position and the orientation of the vehicle.
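The core idea, that the aligning transform itself is the pose, can be illustrated with a 2-D least-squares alignment of matched annotated points. This is a generic rigid-alignment sketch under the assumption of known point correspondences, not the patent's specific method.

```python
import math

def align_2d(image_pts, map_pts):
    """Estimate the rotation and translation that best align matched
    annotated image points to reference-map points (2-D least squares).
    The recovered (tx, ty, theta) directly gives the vehicle's position
    and heading in the map frame."""
    n = len(image_pts)
    cx_a = sum(p[0] for p in image_pts) / n
    cy_a = sum(p[1] for p in image_pts) / n
    cx_b = sum(p[0] for p in map_pts) / n
    cy_b = sum(p[1] for p in map_pts) / n
    # Accumulate cross- and dot-products of centered point pairs.
    s_cross = s_dot = 0.0
    for (ax, ay), (bx, by) in zip(image_pts, map_pts):
        ax, ay = ax - cx_a, ay - cy_a
        bx, by = bx - cx_b, by - cy_b
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    # Translation maps the rotated image centroid onto the map centroid.
    c, s = math.cos(theta), math.sin(theta)
    tx = cx_b - (c * cx_a - s * cy_a)
    ty = cy_b - (s * cx_a + c * cy_a)
    return tx, ty, theta
```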

Systems and methods for hybrid prediction framework with inductive bias

Systems and methods are provided for implementing hybrid prediction. Hybrid prediction integrates two deep-learning-based trajectory prediction approaches: grid-based approaches and graph-based approaches. Hybrid prediction techniques can achieve enhanced performance by combining the grid and graph approaches in a manner that incorporates appropriate inductive biases for different elements of a high-dimensional space. A hybrid prediction framework processor can generate trajectory predictions relating to movement of agents in a surrounding environment based on a prediction model generated using hybrid prediction. Trajectory predictions output from the hybrid prediction framework processor can be used to control an autonomous vehicle. For example, the autonomous vehicle can perform safety-aware and autonomous operations to avoid oncoming objects, based on the trajectory predictions.
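One way the two approaches could be fused, shown purely for intuition, is to score graph-derived candidate trajectories against a grid-based occupancy heatmap. This fusion rule is an assumption for illustration, not the framework's actual architecture.

```python
def score_candidates(grid_heatmap, candidates):
    """Fuse grid and graph outputs: score each graph-based candidate
    trajectory (a list of (row, col) cells) by the product of grid-based
    occupancy probabilities along its path."""
    scores = []
    for path in candidates:
        p = 1.0
        for r, c in path:
            p *= grid_heatmap[r][c]
        scores.append(p)
    return scores

def best_candidate(grid_heatmap, candidates):
    """Index of the candidate trajectory most supported by the grid."""
    scores = score_candidates(grid_heatmap, candidates)
    return max(range(len(candidates)), key=scores.__getitem__)
```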

Transferring data from autonomous vehicles
11580687 · 2023-02-14 · ·

A system includes at least one imaging sensor and a processor. The processor is configured to acquire, using the imaging sensor, detected data describing an environment of an autonomous vehicle. The processor is further configured to derive reference data, which describe the environment, from a predefined map, to compute difference data representing a difference between the detected data and the reference data, and to transfer the difference data. Other embodiments are also described.
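The bandwidth-saving step, transferring only the difference between detected data and map-derived reference data, can be sketched with dictionaries keyed by object identifier. The keyed representation and the `None` marker for map objects that were not observed are illustrative assumptions.

```python
def difference_data(detected, reference):
    """Compute the difference between detected environment data and
    reference data derived from a predefined map, so only the difference
    need be transferred."""
    diff = {}
    for key, obs in detected.items():
        if reference.get(key) != obs:
            diff[key] = obs        # object that is new or has changed
    for key in reference:
        if key not in detected:
            diff[key] = None       # object in the map but not observed
    return diff
```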

System and method for interception and countering unmanned aerial vehicles (UAVs)

Systems, devices, and methods for identifying a target aerial vehicle, deploying an interceptor aerial vehicle comprising at least one effector, maneuvering the interceptor aerial vehicle to a position to engage a target aerial vehicle, deploying the at least one effector to intercept the target aerial vehicle, and confirming that the target aerial vehicle has been intercepted.

Autonomous driving instructions
11578986 · 2023-02-14 · ·

Current data for a geographic area is accessed. The current data comprises at least one of (a) current traffic data for the geographic area, (b) current incident data for the geographic area, or (c) current weather data for the geographic area. Based on the current data, autonomous driving instructions are determined for the geographic area. A notification comprising the autonomous driving instructions is provided such that the notification is received by a vehicle apparatus located within the geographic area or expected to enter the geographic area based on a route being traversed by a vehicle corresponding to the vehicle apparatus. The vehicle apparatus is onboard the vehicle and is configured to control the vehicle in accordance with the autonomous driving instructions.
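A toy rule set shows how current traffic, incident, and weather data might map to autonomous driving instructions for an area. Every rule, field name, and value here is an assumption for illustration only.

```python
def driving_instructions(traffic, incidents, weather):
    """Derive autonomous driving instructions for a geographic area from
    current traffic, incident, and weather data (all rules hypothetical)."""
    instr = {"max_speed_kph": 100, "min_headway_s": 2.0, "autonomy_allowed": True}
    if weather in ("snow", "ice"):
        instr["max_speed_kph"] = 60
        instr["min_headway_s"] = 4.0
    if traffic == "heavy":
        instr["max_speed_kph"] = min(instr["max_speed_kph"], 80)
    if incidents > 0:
        instr["autonomy_allowed"] = False   # hand control back to the driver
    return instr
```

A notification carrying this dictionary would then be delivered to vehicle apparatuses in, or routed toward, the area.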

Apparatus and method for safety improvement by collaborative autonomous vehicles
11580857 · 2023-02-14 · ·

An apparatus for safety collaboration in computer-assisted or autonomous driving (CA/AD) vehicles includes an input interface to obtain sensor data from one or more sensors of a CA/AD vehicle, an output interface, and an analyzer coupled to the input and output interfaces to process the sensor data to identify an emergency condition of the CA/AD vehicle, and in response to the identified emergency condition, cause a communication interface of the CA/AD vehicle, via the output interface, to broadcast a request for assistance to be received by one or more nearby CA/AD vehicles. The apparatus may be disposed in the CA/AD vehicle.
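The analyzer's role, detect an emergency condition in sensor data and trigger a broadcast, can be sketched as below. The specific emergency checks, thresholds, and message format are hypothetical.

```python
import json
from typing import Optional

def emergency_broadcast(sensor_data) -> Optional[str]:
    """Process sensor data to identify an emergency condition and, if one
    is found, build the assistance request to broadcast to nearby CA/AD
    vehicles. Returns None when no emergency is identified."""
    emergencies = []
    if sensor_data.get("tire_pressure_kpa", 220) < 150:
        emergencies.append("tire_failure")
    if sensor_data.get("decel_mps2", 0.0) > 8.0:
        emergencies.append("hard_brake")
    if not emergencies:
        return None
    return json.dumps({"type": "assistance_request",
                       "conditions": emergencies,
                       "position": sensor_data.get("position")})
```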

Autonomously acting robot whose activity amount is controlled
11579617 · 2023-02-14 · ·

A robot includes an operation control unit that selects a motion of the robot, a drive mechanism that executes a motion selected by the operation control unit, and a remaining battery charge monitoring unit that monitors a remaining charge of a rechargeable battery. Behavioral characteristics of the robot change in accordance with the remaining battery charge. For example, a motion with a small processing load is selected with a probability that increases as the remaining battery charge decreases. Referring to consumption plan data that define a power consumption pace of the rechargeable battery, the behavioral characteristics of the robot may be caused to change in accordance with a difference between the remaining battery charge scheduled in the consumption plan data and the actual remaining battery charge.
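The battery-dependent motion selection can be sketched as a weighted random choice in which low-load motions gain weight as the battery drains. The specific weighting formula is an assumption; the patent only requires the stated qualitative behavior.

```python
import random

def motion_weights(loads, battery_frac):
    """Per-motion selection weights: at full charge (battery_frac = 1.0)
    all motions are equally likely; as the battery drains, weight shifts
    toward motions with a small processing load."""
    exponent = 1.0 - battery_frac          # 0 at full charge, 1 when empty
    return [load ** -exponent for load in loads]

def select_motion(motions, battery_frac, rng=random):
    """Pick a motion name from {name: processing_load} by weighted chance."""
    names = list(motions)
    w = motion_weights(list(motions.values()), battery_frac)
    return rng.choices(names, weights=w, k=1)[0]
```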

Vehicle control apparatus, vehicle control method, vehicle, and storage medium
11577760 · 2023-02-14 · ·

A vehicle control apparatus comprises a first detection unit configured to have a first detection range, a second detection unit configured to have a second detection range which at least partially overlaps the first detection range, and a vehicle control unit configured to be capable of performing vehicle control based on a first control state and vehicle control based on a second control state which has a higher vehicle control automation rate, or a reduced degree of vehicle operation participation requested of the driver, compared to the first control state. The vehicle control unit performs control to shift from the first control state to the second control state based on a condition that a match degree between pieces of preceding object information of a vehicle detected by the first detection unit and the second detection unit satisfies a predetermined criterion.
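The shift condition can be sketched as a cross-check between the two detection units. The use of Jaccard overlap over detected object identifiers and the threshold value are assumptions; the patent specifies only that a match degree between the two units' preceding-object information gates the shift.

```python
def can_shift_to_second_state(objs_first, objs_second, match_threshold=0.8):
    """Allow the shift to the higher-automation control state only when the
    preceding-object information from the two detection units agrees closely
    enough (here: Jaccard overlap of detected object IDs)."""
    a, b = set(objs_first), set(objs_second)
    if not a and not b:
        return False               # nothing detected, nothing to cross-check
    match_degree = len(a & b) / len(a | b)
    return match_degree >= match_threshold
```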