Patent classifications
B60W2420/403
NPU IMPLEMENTED FOR ARTIFICIAL NEURAL NETWORKS TO PROCESS FUSION OF HETEROGENEOUS DATA RECEIVED FROM HETEROGENEOUS SENSORS
A neural processing unit (NPU) includes a controller including a scheduler, the controller configured to receive from a compiler a machine code of an artificial neural network (ANN) including a fusion ANN, the machine code including data locality information of the fusion ANN, and receive heterogeneous sensor data from a plurality of sensors corresponding to the fusion ANN; at least one processing element configured to perform fusion operations of the fusion ANN including a convolution operation and at least one special function operation; a special function unit (SFU) configured to perform a special function operation of the fusion ANN; and an on-chip memory configured to store operation data of the fusion ANN, wherein the scheduler is configured to control the at least one processing element and the on-chip memory such that all operations of the fusion ANN are processed in a predetermined sequence according to the data locality information.
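A minimal sketch of the scheduling idea described above, assuming hypothetical op and unit names (the patent does not disclose an actual instruction format): the scheduler dispatches every fusion-ANN operation in the compiler-given order, routing convolutions to the processing elements and special functions to the SFU.

```python
from dataclasses import dataclass, field

@dataclass
class Op:
    name: str   # e.g. "conv", "concat" (hypothetical op names)
    unit: str   # "PE" for a processing element, "SFU" for the special function unit

@dataclass
class Scheduler:
    """Toy scheduler: dispatches ops strictly in the order given by the
    compiler's data locality information, never reordering them."""
    locality_order: list          # sequence of Op, from the machine code
    log: list = field(default_factory=list)

    def run(self):
        for op in self.locality_order:
            # Intermediates would live in on-chip memory; here we only
            # record which unit executes each op, in sequence.
            self.log.append((op.unit, op.name))
        return self.log
```

The point of the fixed sequence is that on-chip memory traffic becomes fully predictable, which is what the data locality information is for.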
IMAGE SEGMENTATION METHOD AND DEVICE
An electronic device extracts feature data from an input image, calculates one or more class maps from the feature data using a classifier layer, calculates one or more cluster maps from the feature data using a clustering layer, and generates image segmentation data using the one or more class maps and the one or more cluster maps.
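One plausible reading of the class-map/cluster-map combination, sketched with plain lists (the per-pixel argmax fusion here is an assumption, not the patent's disclosed method): the classifier layer's maps give each pixel a semantic class, the clustering layer's maps give it an instance id, and the segmentation data pairs the two.

```python
def argmax_map(maps):
    """Per-pixel argmax over a list of H×W score maps."""
    h, w = len(maps[0]), len(maps[0][0])
    return [[max(range(len(maps)), key=lambda k: maps[k][y][x])
             for x in range(w)] for y in range(h)]

def segment(class_maps, cluster_maps):
    """Hypothetical fusion: returns (semantic labels, cluster ids),
    one entry per pixel, computed from the two sets of maps."""
    return argmax_map(class_maps), argmax_map(cluster_maps)
```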
Vehicular collision avoidance system
A vehicular collision avoidance system includes a forward-viewing camera, a rearward-viewing camera, a rearward-sensing non-vision sensor and an electronic control unit. The vehicular collision avoidance system detects vehicles present forward and/or rearward of the equipped vehicle. Responsive to at least one selected from the group consisting of (i) data processing of image data captured by the rearward-viewing camera and (ii) data processing of sensor data captured by the rearward-sensing non-vision sensor, the vehicular collision avoidance system detects another vehicle approaching the equipped vehicle from the rear, determines that the other vehicle is traveling in the same traffic lane as the equipped vehicle, determines the speed difference between the vehicles, and determines the distance from the equipped vehicle to the other vehicle. Based on such determinations, the system determines that impact with the equipped vehicle by the other vehicle is imminent.
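The imminence determination above amounts to a time-to-collision test. A minimal sketch under assumed names and a hypothetical threshold (the patent does not publish its actual criterion):

```python
def impact_imminent(rel_speed_mps, distance_m, same_lane, ttc_threshold_s=2.0):
    """Hypothetical imminence check: the approaching vehicle must share the
    equipped vehicle's lane, be closing (positive relative speed), and its
    time-to-collision (distance / closing speed) must fall below threshold."""
    if not same_lane or rel_speed_mps <= 0:
        return False
    return distance_m / rel_speed_mps < ttc_threshold_s
```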
METHOD AND SYSTEM FOR DEVELOPING AUTONOMOUS VEHICLE TRAINING SIMULATIONS
Methods and systems for generating vehicle motion planning model simulation scenarios are disclosed. The system receives a base simulation scenario with features of a scene through which a vehicle may travel. The system then generates an augmentation element with a simulated behavior for an object in the scene by: (i) accessing a data store in which behavior probabilities are mapped to object types to retrieve a set of behavior probabilities for the object; and (ii) applying a randomization function to the behavior probabilities to select the simulated behavior for the object. The system adds the augmentation element to the base simulation scenario at an interaction zone to yield an augmented simulation scenario. The system then applies the augmented simulation scenario to an autonomous vehicle motion planning model to train the motion planning model.
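Steps (i) and (ii) are a weighted random draw over a type-to-behavior table. A minimal sketch, with an entirely hypothetical probability table (the patent does not disclose its data store contents):

```python
import random

# Hypothetical data store: object type → behavior probabilities
BEHAVIOR_PROBABILITIES = {
    "pedestrian": {"cross": 0.3, "wait": 0.6, "jaywalk": 0.1},
    "cyclist":    {"merge": 0.5, "stop": 0.5},
}

def select_behavior(object_type, rng=random):
    """(i) retrieve the behavior probabilities for the object type,
    (ii) apply a randomization function to pick one behavior."""
    probs = BEHAVIOR_PROBABILITIES[object_type]
    behaviors, weights = zip(*probs.items())
    return rng.choices(behaviors, weights=weights, k=1)[0]
```

Because the draw is randomized per scenario, repeated augmentation of the same base scenario yields a varied training set.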
Method for a position determination of a vehicle, control unit, and vehicle
A method for a position determination of a vehicle, at least one camera and one sensor unit for a global satellite navigation system being situated on the vehicle. The method includes: acquiring at least one camera image of the environment of the vehicle with the aid of the camera, generating a transformed image as a function of the acquired camera image, with the transformed image having a virtual perspective pointing perpendicularly in the downward direction, determining a satellite position of the vehicle through a satellite-based position determination, and providing an aerial image of the environment of the vehicle as a function of the determined satellite position. Subsequently, a position of the transformed image is detected in the provided aerial image, and a vehicle position is ascertained as a function of the detected position of the transformed image in the provided aerial image.
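The detection step is essentially template matching: locate the top-down (bird's-eye) camera view inside the aerial image. A brute-force sketch over toy grayscale grids, assuming the perspective transform has already been applied (the patent's actual matching method is not disclosed):

```python
def locate(patch, aerial):
    """Slide the transformed (top-down) camera view over the aerial image
    and return the (row, col) offset with the smallest absolute-difference
    score. Refining the satellite fix with this offset gives the vehicle
    position."""
    ph, pw = len(patch), len(patch[0])
    best, best_pos = float("inf"), None
    for y in range(len(aerial) - ph + 1):
        for x in range(len(aerial[0]) - pw + 1):
            score = sum(abs(aerial[y + i][x + j] - patch[i][j])
                        for i in range(ph) for j in range(pw))
            if score < best:
                best, best_pos = score, (y, x)
    return best_pos
```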
DIRECTED CONTROL TRANSFER WITH AUTONOMOUS VEHICLES
Techniques for cognitive analysis for directed control transfer with autonomous vehicles are described. In-vehicle sensors are used to collect cognitive state data for an individual within a vehicle which has an autonomous mode of operation. The cognitive state data includes infrared, facial, audio, or biosensor data. One or more processors analyze the cognitive state data collected from the individual to produce cognitive state information. The cognitive state information includes a subset or summary of cognitive state data, or an analysis of the cognitive state data. The individual is scored based on the cognitive state information to produce a cognitive scoring metric. A state of operation is determined for the vehicle. A condition of the individual is evaluated based on the cognitive scoring metric. Control is transferred between the vehicle and the individual based on the state of operation of the vehicle and the condition of the individual.
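The final transfer decision reduces to a policy over the vehicle's state of operation and the cognitive scoring metric. A hedged sketch with hypothetical state names and threshold (the patent does not publish its scoring scale):

```python
def transfer_decision(vehicle_state, cognitive_score, threshold=0.7):
    """Hypothetical policy: hand control to the individual only when the
    vehicle is autonomous and the individual's cognitive score clears the
    threshold; take control back from a manual driver who falls below it."""
    if vehicle_state == "autonomous" and cognitive_score >= threshold:
        return "transfer_to_individual"
    if vehicle_state == "manual" and cognitive_score < threshold:
        return "transfer_to_vehicle"
    return "no_change"
```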
Work Vehicle
A work vehicle capable of traveling on a public road and in a working field includes: a public road traveling determination unit configured to generate vehicle body position information regarding a position at which the vehicle body is located, determine based on the vehicle body position information whether or not the vehicle body is traveling on the public road, and output a determination result; and a vehicle speed limiter configured to limit a vehicle speed in accordance with the determination result.
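The speed limiter's logic can be sketched in a few lines, with an assumed public-road limit (the abstract does not state one):

```python
def limit_speed(requested_kph, on_public_road, road_limit_kph=40.0):
    """Cap the commanded speed while the vehicle body position information
    indicates travel on a public road; leave it uncapped in the working
    field. The 40 km/h default is a placeholder, not the patent's value."""
    return min(requested_kph, road_limit_kph) if on_public_road else requested_kph
```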
AUTOMOTIVE ELECTRONIC SYSTEM AND CONTROL METHOD THEREOF
An automotive electronic system includes a detection element, a first cable, and a control device. The detection element includes a first serializer. The first serializer supports a first signal mode. The control device includes a processor and a first deserializer. The processor generates a first control signal. The first deserializer is coupled through the first cable to the first serializer, and is switchable between a plurality of operation modes according to the first control signal. Responsive to the detection element, the first deserializer selects a first specific mode among the operation modes, and the first specific mode matches with the first signal mode.
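The mode-selection behavior, sketched abstractly with made-up mode names (the patent identifies no concrete SerDes rates or protocols):

```python
def configure_deserializer(control_signal, serializer_mode, operation_modes):
    """Hypothetical selection: when the processor's control signal enables
    switching, the deserializer picks the operation mode that matches the
    serializer's signal mode; otherwise it keeps its current mode (None)."""
    if not control_signal:
        return None
    matches = [m for m in operation_modes if m == serializer_mode]
    return matches[0] if matches else None
```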
Method and system for automated calibration of sensors
The invention relates to a method for automated calibration of sensors of a vehicle, wherein at least one first passive optical sensor and at least one second active optical sensor are calibrated by a calibration unit based on a matching spatial orientation of recognised environmental features in transformed sensor data of the first sensor and the sensor data captured by the second sensor.
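Once environmental features are recognised in both sensors' (transformed) data, calibration comes down to finding the transform that aligns the matched features. A minimal sketch that estimates only a 2-D translation between matched points (a real extrinsic calibration would also solve for rotation; the point pairs here are illustrative):

```python
def estimate_translation(camera_pts, active_pts):
    """Best-fit translation (least squares) between matched features from
    the passive optical sensor (camera) and the active optical sensor
    (e.g. lidar): the mean displacement over all correspondences."""
    n = len(camera_pts)
    dx = sum(a[0] - c[0] for c, a in zip(camera_pts, active_pts)) / n
    dy = sum(a[1] - c[1] for c, a in zip(camera_pts, active_pts)) / n
    return dx, dy
```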
Method and System for On-Demand Roadside AI Service
A method comprises receiving a service request from a vehicle, obtaining environment data with one or more sensors, determining a vehicle type of the vehicle based on the service request, determining service data responsive to the service request based on the vehicle type of the vehicle and the environment data, and transmitting a service message comprising the service data to the vehicle.
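The request-to-message flow above, sketched end to end with an invented hazard criterion (overhead clearance versus vehicle height; the per-type heights are placeholders, not from the patent):

```python
# Hypothetical vehicle heights used to tailor the service data per type
VEHICLE_HEIGHT_M = {"truck": 4.0, "car": 1.6}

def handle_request(service_request, environment_data):
    """Roadside unit flow: determine the vehicle type from the request,
    derive type-specific service data from the sensed environment, and
    wrap it in a service message for transmission back to the vehicle."""
    vehicle_type = service_request["vehicle_type"]
    height = VEHICLE_HEIGHT_M.get(vehicle_type, 2.0)
    hazards = [o for o in environment_data if o["clearance_m"] < height]
    return {"service_data": {"vehicle_type": vehicle_type, "hazards": hazards}}
```

The same environment data yields different service data per vehicle type, which is the point of determining the type before responding.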