Patent classifications
G05D1/228
Vehicle adaptive crawl control
Systems and methods for using autonomous vehicle assistance for freeing a vehicle from a stuck condition may include: receiving sensor data from a vehicle sensor indicating a condition of the vehicle; determining from the sensor data that the vehicle is in a stuck condition; obtaining a technique for freeing the vehicle from the stuck condition, wherein the technique is a learned solution developed based on collected data related to vehicle operator techniques for extrication; and taking over at least partial control of the vehicle from its operator and applying the learned technique to the vehicle.
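As an illustration only (not the patented method), a minimal sketch of the first step, detecting a stuck condition from sensor data. The `SensorData` fields, the fixed wheel radius, and the slip-ratio heuristic are all assumptions for this sketch: high wheel spin with little ground movement is treated as "stuck".

```python
from dataclasses import dataclass

@dataclass
class SensorData:
    wheel_speed: float   # angular speed of the driven wheels, rad/s
    ground_speed: float  # measured vehicle speed over ground, m/s

WHEEL_RADIUS_M = 0.3  # assumed wheel radius for this sketch

def is_stuck(sample: SensorData, slip_threshold: float = 0.9) -> bool:
    """Flag a stuck condition when the wheels spin but the vehicle barely moves."""
    if sample.wheel_speed <= 0.0:
        return False  # wheels not turning: not the wheel-spin case
    expected_speed = sample.wheel_speed * WHEEL_RADIUS_M
    slip = 1.0 - sample.ground_speed / expected_speed
    return slip > slip_threshold
```

A real system would debounce this over time and combine it with other signals (throttle position, IMU data) before handing control to a learned extrication policy.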
Auto clean machine and auto clean machine control method
An auto clean machine, comprising: a light source configured to emit light to illuminate at least one light region outside and in front of the auto clean machine; a first image sensing area configured to sense a first brightness distribution of the light region; a second image sensing area below the first image sensing area, configured to sense a second brightness distribution of the light region; and a processor configured to control movement of the auto clean machine according to the first brightness distribution and the second brightness distribution. The processor generates a wall detection result based on the first brightness distribution of the light region, generates a cliff detection result based on the second brightness distribution of the light region, and controls the movement of the auto clean machine according to the wall detection result and the cliff detection result.
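A hedged sketch of the control logic described above. The thresholds and the "reverse"/"turn"/"forward" actions are invented for illustration; the idea is that a bright return in the upper sensing area suggests light reflecting off a wall, while a dim lower sensing area suggests the floor has dropped away.

```python
def detect_wall(upper_brightness, wall_threshold=200):
    """A bright spot in the upper sensing area suggests reflection off a wall."""
    return max(upper_brightness) >= wall_threshold

def detect_cliff(lower_brightness, cliff_threshold=30):
    """A uniformly dim lower sensing area suggests a cliff (no nearby floor)."""
    return max(lower_brightness) <= cliff_threshold

def next_move(upper_brightness, lower_brightness):
    """Combine the two detection results into a movement command."""
    if detect_cliff(lower_brightness):
        return "reverse"  # cliff takes priority: back away
    if detect_wall(upper_brightness):
        return "turn"
    return "forward"
```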
Fastest lane determination algorithm under traffic jam
A method, apparatus, and system for determining average lane travel speeds is disclosed. A plurality of vehicles traveling in a same direction as the ADV in a plurality of lanes are identified. Over a first time period, the plurality of vehicles is tracked. Within the plurality of vehicles, at least a first quantity of vehicles representative of those traveling in each lane over the first time period is identified. For each of the plurality of lanes, an average speed over the first time period of the representative vehicles associated with the lane is determined. A trajectory is planned for the ADV, wherein the planned trajectory moves toward a lane whose representative vehicles have a fastest average speed. Thereafter, control signals are generated to control operations of the ADV based on the planned trajectory.
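The per-lane averaging step can be sketched as follows. This is an illustration under assumed inputs, not the claimed implementation: `observations` is a hypothetical list of `(lane_id, speed)` samples for the representative vehicles tracked over the first time period.

```python
from collections import defaultdict

def fastest_lane(observations):
    """Return the lane id whose representative vehicles have the highest
    average speed. observations: iterable of (lane_id, speed_mps)."""
    speed_sums = defaultdict(float)
    counts = defaultdict(int)
    for lane, speed in observations:
        speed_sums[lane] += speed
        counts[lane] += 1
    # Pick the lane with the highest mean speed over the period.
    return max(speed_sums, key=lambda lane: speed_sums[lane] / counts[lane])
```

The planner would then generate a trajectory toward the returned lane and hand it to the control module.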
Autonomous vehicle system for determining a pullover spot in response to detected local failure
A method for determining a pullover spot for a vehicle is described. The method includes using a computing device to detect information related to a system of the vehicle or an environment surrounding the vehicle using a sensor of the vehicle and determine a local failure of the vehicle based on the information. The computing device may then be used to determine that the vehicle should pull over before completing a current trip related to transporting a passenger or good by comparing vehicle requirements for the trip with the local failure, and to determine a pullover spot by identifying a first area for the vehicle to park in part based on a second area being available for a second vehicle to pick up the passenger or good. The computing device may operate the vehicle to the pullover spot and transmit a request for a second vehicle.
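A minimal sketch of the spot-selection condition: the chosen parking area must both fit the failed vehicle and leave an adjacent area available for the replacement vehicle to pick up the passenger or good. The dictionary keys here are assumptions for illustration.

```python
def choose_pullover_spot(spots):
    """spots: list of dicts with hypothetical keys 'id', 'fits_vehicle',
    and 'adjacent_pickup_free'. Return the id of the first area that fits
    the vehicle while keeping a second area free for the pickup vehicle."""
    for spot in spots:
        if spot["fits_vehicle"] and spot["adjacent_pickup_free"]:
            return spot["id"]
    return None  # no suitable spot found; caller must widen the search
```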
Perimeter sensor housings
The technology relates to an exterior sensor system for a vehicle configured to operate in an autonomous driving mode. The technology includes a close-in sensing (CIS) camera system to address blind spots around the vehicle. The CIS system is used to detect objects within a few meters of the vehicle. Based on object classification, the system is able to make real-time driving decisions. Classification is enhanced by employing cameras in conjunction with lidar sensors. The specific arrangement of multiple sensors in a single sensor housing is also important to object detection and classification. Thus, the positioning of the sensors and support components are selected to avoid occlusion and to otherwise prevent interference between the various sensor housing elements.
Event-based connected vehicle control and response systems
Event-based connected vehicle control and response systems, methods, and apparatus are disclosed. An example method comprises identifying the occurrence of an event, storing first data corresponding to apparatus operation for a first threshold amount of time prior to the event, during the occurrence of the event, and for a second threshold amount of time after the event, determining whether a responsive object is involved in or near the event, in response to determining that the responsive object is involved in or near the event, transmitting the first data to the responsive object, and receiving, from the responsive object, second data, analyzing the first data and the second data to determine a party at-fault for the event, aggregating the first data and second data into an event report, and causing, automatically, a response to be initiated through an entity associated with the party at-fault.
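The data-capture step (first data from before, during, and after the event) resembles a rolling pre-event buffer plus a post-event window. The sketch below is an assumed implementation of that windowing, not the patented system; class and method names are invented.

```python
from collections import deque

class EventRecorder:
    """Keep a rolling window of operation data; when an event is marked,
    capture samples from `pre` seconds before it through `post` seconds after."""

    def __init__(self, pre: float, post: float):
        self.pre, self.post = pre, post
        self.buffer = deque()      # (timestamp, sample) pairs, newest last
        self.event_time = None
        self.captured = []

    def record(self, t: float, sample):
        self.buffer.append((t, sample))
        # Drop samples older than the pre-event window.
        while self.buffer and self.buffer[0][0] < t - self.pre:
            self.buffer.popleft()
        # After an event, keep appending until the post-event window closes.
        if self.event_time is not None and t <= self.event_time + self.post:
            self.captured.append((t, sample))

    def mark_event(self, t: float):
        self.event_time = t
        self.captured = list(self.buffer)  # snapshot of the pre-event window
```

The captured window would then be transmitted to the responsive object and merged with its second data for fault analysis.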
Multimodal multi-technique signal fusion system for autonomous vehicle
An autonomous vehicle incorporating a multimodal multi-technique signal fusion system is described herein. The signal fusion system is configured to receive at least one sensor signal that is output by at least one sensor system (multimodal), such as at least one image sensor signal from at least one camera. The at least one sensor signal is provided to a plurality of object detector modules of different types (multi-technique), such as an absolute detector module and a relative activation detector module, that generate independent directives based on the at least one sensor signal. The independent directives are fused by a signal fusion module to output a fused directive for controlling the autonomous vehicle.
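One simple way to fuse independent directives, shown here purely as an illustration (the abstract does not specify the fusion rule): rank directives by how conservative they are and let the most conservative one win, so a single detector flagging danger is never overridden.

```python
# Hypothetical directive vocabulary, ordered by conservatism.
SEVERITY = {"maintain": 0, "slow": 1, "stop": 2}

def fuse_directives(directives):
    """Fuse independent directives from multiple detector modules into a
    single directive by taking the most conservative proposal."""
    return max(directives, key=lambda d: SEVERITY[d])
```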
Adaptive illumination for a time-of-flight camera on a vehicle
Disclosed are devices, systems and methods for capturing an image. In one aspect, an electronic camera apparatus includes an image sensor with a plurality of pixel regions. The apparatus further includes an exposure controller. The exposure controller determines, for each of the plurality of pixel regions, a corresponding exposure duration and a corresponding exposure start time. Each pixel region begins to integrate incident light starting at the corresponding exposure start time and continues to integrate light for the corresponding exposure duration. In some example embodiments, at least two of the corresponding exposure durations or at least two of the corresponding exposure start times are different in the image.
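A hedged sketch of how an exposure controller might assign per-region durations. The inverse-brightness scaling, target level, and clamping bounds are assumptions for this example: dim regions get longer exposures, bright regions get shorter ones.

```python
def region_exposures(region_brightness, target=128.0, base_us=1000.0):
    """Given a mean brightness per pixel region (0-255), return an exposure
    duration in microseconds per region, scaled inversely with brightness
    and clamped to a plausible hardware range."""
    exposures = []
    for brightness in region_brightness:
        scale = target / max(brightness, 1.0)  # avoid division by zero
        exposures.append(min(max(base_us * scale, 100.0), 10000.0))
    return exposures
```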
Battery installation system, battery installation method, and program
Remaining charge acquisition means of a battery installation system acquires remaining charge information on a remaining charge of each of a plurality of batteries which are installable in an unmanned aerial vehicle. Battery weight acquisition means acquires battery weight information on a weight of each battery. Location acquisition means acquires location information on a movement destination of the unmanned aerial vehicle. Selection means selects, based on the remaining charge information, the battery weight information, and the location information, from among the plurality of batteries, a battery having a remaining charge equal to or more than a battery consumption amount for moving to the movement destination. Processing execution means executes processing for installing the battery selected by the selection means in the unmanned aerial vehicle.
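The selection step can be sketched as follows. The tuple layout and the lightest-first tie-breaking rule are assumptions for illustration; the abstract only requires selecting a battery whose remaining charge covers the consumption for the trip, with weight as an input to the decision.

```python
def select_battery(batteries, consumption_wh):
    """batteries: list of (battery_id, remaining_wh, weight_g).
    Among batteries with enough remaining charge to reach the movement
    destination, prefer the lightest one (less mass to lift means less
    consumption in flight)."""
    candidates = [b for b in batteries if b[1] >= consumption_wh]
    if not candidates:
        return None  # no installable battery can complete the trip
    return min(candidates, key=lambda b: b[2])[0]
```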