Patent classifications
G01S2013/93272
WORK MACHINE AND METHOD FOR CONTROLLING WORK MACHINE
The wheel loader (10) includes a vehicle main body (1), a rear detection section (71), a vehicle main body angle sensor (72), and a controller (26). The rear detection section (71) detects an obstacle (5) in the rear of the vehicle main body (1). The vehicle main body angle sensor (72) detects the inclination state of the vehicle main body (1). The controller (26) determines the control corresponding to the detection by the rear detection section (71), based on the inclination state of the vehicle main body (1) detected by the vehicle main body angle sensor (72).
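The inclination-dependent choice of control can be sketched as a simple threshold rule; the threshold, action names, and rationale below are illustrative assumptions, not the patented control logic:

```python
# Hypothetical sketch: choose the response to a rear-obstacle detection
# based on the vehicle body's measured pitch angle. Threshold and action
# names are illustrative, not taken from the patent.

def select_rear_detection_control(obstacle_detected: bool,
                                  pitch_deg: float,
                                  pitch_limit_deg: float = 10.0) -> str:
    """Return a control action for a rear detection event.

    When the body is strongly inclined (e.g. the loader is digging into
    a pile), the rear sensor may see the ground or the pile itself, so
    the response is softened to a warning instead of automatic braking.
    """
    if not obstacle_detected:
        return "none"
    if abs(pitch_deg) > pitch_limit_deg:
        return "warn_only"      # likely a false detection on a slope
    return "brake"              # level ground: treat detection as real

print(select_rear_detection_control(True, 2.0))   # brake
print(select_rear_detection_control(True, 15.0))  # warn_only
```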
Automatic autonomous vehicle and robot LiDAR-camera extrinsic calibration
Extrinsic calibration of a Light Detection and Ranging (LiDAR) sensor and a camera can comprise constructing a first plurality of reconstructed calibration targets in a three-dimensional space based on physical calibration targets detected from input from the LiDAR and a second plurality of reconstructed calibration targets in the three-dimensional space based on physical calibration targets detected from input from the camera. Reconstructed calibration targets in the first and second plurality of reconstructed calibration targets can be matched and a six-degree of freedom rigid body transformation of the LiDAR and camera can be computed based on the matched reconstructed calibration targets. A projection of the LiDAR to the camera can be computed based on the computed six-degree of freedom rigid body transformation.
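The matched-target transform computation can be sketched with the standard Kabsch/Umeyama least-squares fit between corresponding 3-D points; this is one common way to obtain the six-degree-of-freedom rigid body transformation the abstract describes, not necessarily the patented method:

```python
import numpy as np

def rigid_transform(P: np.ndarray, Q: np.ndarray):
    """Least-squares 6-DoF rigid transform (R, t) with Q ≈ R @ P + t.

    P, Q: (N, 3) arrays of matched 3-D points, e.g. reconstructed
    calibration-target centroids from the LiDAR and from the camera.
    Standard Kabsch/Umeyama solution via SVD of the cross-covariance.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Quick self-check with a known rotation and translation.
rng = np.random.default_rng(1)
P = rng.normal(size=(5, 3))
angle = np.pi / 6
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
Q = P @ R_true.T + t_true
R, t = rigid_transform(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

Projecting LiDAR points into the camera frame is then `R @ p + t` followed by the camera's intrinsic projection.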
SENSOR LAYOUT OF VEHICLES
The present disclosure relates to a vehicle. The vehicle includes a first set of cameras, including a first subset of cameras facing to the front of the vehicle; a second set of cameras, with focal lengths less than those of the first set of cameras, the second set of cameras including a second and a third subset of cameras, the second subset of cameras facing to the front of the vehicle, and third subset of cameras facing to a side front and/or a side of the vehicle; and a third set of cameras, with focal lengths less than those of the second set of cameras, the third set of cameras including a fourth and a fifth subset of camera, the fourth subset of cameras facing to the front of the vehicle, and the fifth subset of camera facing to the side front and/or side of the vehicle.
Motor vehicle and method for a 360° detection of the motor vehicle surroundings
The invention relates to a method and a motor vehicle comprising a sensor assembly for a 360° detection of the motor vehicle surroundings. The sensor assembly has multiple sensors of the same type, wherein each of the multiple sensors has a specified detection region and the sensors are distributed around the exterior of the motor vehicle such that the detection regions collectively provide a complete detection zone which covers the surroundings in a complete angle about the motor vehicle at a specified distance from the motor vehicle. The sensors are each designed to detect the surroundings in their respective detection region as respective sensor data in respective successive synchronized time increments. The sensor assembly has a pre-processing mechanism which fuses the sensor data of each of the sensors in order to generate a three-dimensional image of the surroundings for a respective identical time increment and provides same in a common database.
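Whether a set of per-sensor detection regions really closes the complete angle about the vehicle can be checked with a simple interval-union test; the sector representation below is an illustrative simplification of the abstract's detection regions:

```python
def covers_full_circle(sectors):
    """Check whether angular detection sectors (start_deg, end_deg)
    jointly cover a full 360 degrees around the vehicle.

    Sectors may wrap past 0/360; each is unrolled into [0, 360)
    pieces and the union of pieces is scanned for gaps.
    """
    pieces = []
    for start, end in sectors:
        start %= 360.0
        end %= 360.0
        if end <= start:                      # sector wraps past 0
            pieces.append((start, 360.0))
            pieces.append((0.0, end))
        else:
            pieces.append((start, end))
    pieces.sort()
    reach = 0.0
    for s, e in pieces:
        if s > reach:                         # uncovered gap before this piece
            return False
        reach = max(reach, e)
    return reach >= 360.0

print(covers_full_circle([(330, 30), (30, 180), (180, 330)]))  # True
print(covers_full_circle([(0, 90), (100, 200), (200, 360)]))   # False
```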
Hybrid Sparse Subarray Design For Four-Dimensional Imaging Radar
Two-dimensional DOA estimation is challenging, as the computational and hardware complexity can scale as the square of that of the one-dimensional problem. The proposed scheme relies on designing the antenna locations and involves a mix of subarray and digital beamforming to lower the overall system complexity and cost by reducing the number of costly transceiver chains.
This framework proposes a two-step solution which first isolates a target to a given range-Doppler bin and elevation angle using a linear receive subarray in the elevation direction. This elevation estimate is relatively coarse; it is then refined along with a high-resolution estimate of the azimuth angle. This is achieved by processing the received data from a 2D sparse antenna array whose element positions are systematically chosen to maximize the resolution in both directions. A compressive sensing algorithm is applied to the 2D sparse array data, exploiting the sparse representation of the underlying signal support. The proposed approach successfully pairs the correct elevation and azimuth angles for multiple targets. The methodology is effective with a single data snapshot, and algorithm performance scales well with the availability of multiple data snapshots. It is noted that the proposed methodology allows the system resolution to be further increased when the data is processed with MIMO virtual array processing.
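The sparse-recovery refinement step can be illustrated with a plain orthogonal matching pursuit over an azimuth/elevation dictionary built for a sparse 2-D array; the array layout, angle grid, and recovery algorithm here are generic illustrations, not the patent's specific design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse 2-D array: 12 elements chosen from an 8x8 half-wavelength grid
# (illustrative layout only).
pos = rng.choice(64, size=12, replace=False)
xy = np.stack([pos % 8, pos // 8], axis=1).astype(float)

def steer(az_deg, el_deg):
    """Narrowband steering vector using direction cosines."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    u = np.cos(el) * np.sin(az)   # azimuth direction cosine
    v = np.sin(el)                # elevation direction cosine
    return np.exp(1j * np.pi * (xy[:, 0] * u + xy[:, 1] * v))

# Dictionary over a joint azimuth/elevation grid.
az_grid = np.arange(-60, 61, 2)
el_grid = np.arange(-30, 31, 2)
atoms, angles = [], []
for el in el_grid:
    for az in az_grid:
        atoms.append(steer(az, el))
        angles.append((az, el))
A = np.stack(atoms, axis=1)            # (elements, grid points)

# Single snapshot from two on-grid targets.
truth = [(-20, 10), (30, -4)]
y = sum(steer(az, el) for az, el in truth)

# Orthogonal matching pursuit: greedily pick best-correlated atoms,
# re-fitting the amplitudes by least squares at every step.
support, residual = [], y.copy()
for _ in range(2):
    k = int(np.argmax(np.abs(A.conj().T @ residual)))
    support.append(k)
    As = A[:, support]
    coef, *_ = np.linalg.lstsq(As, y, rcond=None)
    residual = y - As @ coef

# Each recovered atom is a paired (azimuth, elevation) estimate.
found = sorted(angles[k] for k in support)
print(found)
```

Because each dictionary atom is indexed by a joint (azimuth, elevation) pair, the recovery pairs the two angles automatically, which is the property the abstract highlights.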
Driver assistance system and method
A driver assistance system and method are disclosed. The driver assistance system includes a first sensor installed at a vehicle and configured to have a field of view directed forward from the vehicle to acquire front image data, a second sensor selected from a group of radar and LIDAR sensors, installed at the vehicle, and configured to have a field of view directed forward from the vehicle to acquire front detection data, and a controller having a processor configured to process the front image data and the front detection data, wherein the controller is configured to detect a lane in which the vehicle is traveling or a front object located in front of the vehicle in response to the processing of the front image data and the front detection data, output a braking signal to a braking system of the vehicle when a collision between the vehicle and the front object is expected, and output a steering signal to a steering system of the vehicle when a collision between the vehicle and the front object is expected even with braking control.
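The brake-first, steer-if-braking-is-insufficient decision can be sketched with a stopping-distance comparison, v²/(2a); the thresholds and signal names are illustrative assumptions, not the patent's control law:

```python
def plan_avoidance(distance_m: float, closing_speed_mps: float,
                   max_decel_mps2: float = 8.0):
    """Decide between braking and steering for a predicted collision.

    Braking alone suffices if the distance needed to shed the closing
    speed, v^2 / (2 * a), fits within the measured gap; otherwise an
    evasive steering request is issued as well. Illustrative only.
    """
    actions = []
    if closing_speed_mps <= 0.0:
        return actions                    # opening gap: no collision expected
    braking_distance = closing_speed_mps ** 2 / (2.0 * max_decel_mps2)
    actions.append("brake")
    if braking_distance > distance_m:     # cannot stop in time: steer too
        actions.append("steer")
    return actions

print(plan_avoidance(40.0, 20.0))  # 20^2/16 = 25 m < 40 m -> ['brake']
print(plan_avoidance(20.0, 20.0))  # 25 m > 20 m -> ['brake', 'steer']
```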
RADAR SYSTEM AND ASSEMBLY
A non-contact object and/or gesture detection system includes at least one sensor configured to sense an object or motion within a field of view (FOV) using radio frequency radiation. Various sensors and brackets are provided which may allow a position and/or tilt of the sensor to be adjusted for controlling the FOV. A sensor housing includes a vent filter that is breathable but impermeable to liquids. Various antenna designs are provided to produce desired FOV sizes and shapes, particularly for optimizing a radiation pattern that is relatively wide and shallow. A steerable antenna layout is also provided for controlling the location of the FOV without an adjustable bracket. A sensor housing including a projector mount for an icon projector is provided. A seal prevents debris from entering between the antenna and the bumper.
Synchronizing vehicle devices over a controller area network
A method for synchronizing devices in a vehicle may make use of the Controller Area Network (CAN) communication bus. A bus interface of each of two or more devices coupled to the bus may be configured to accept the same message broadcast via the communication bus, in which the message has a specific message identification (ID) header. A message having the specific message ID may then be received from the communication bus simultaneously by each of the two or more devices. Operation of the two or more devices may be synchronized by triggering a task on each of the two or more devices in response to receiving the message having the specific message ID.
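The mechanism can be sketched with a simulated broadcast bus: each device's acceptance filter passes only the agreed sync message ID, and reception triggers the device's task. A real implementation would sit on a CAN driver (for example the python-can package); the in-process bus below merely keeps the sketch self-contained and runnable:

```python
# Minimal in-process sketch of the CAN-based sync idea. The sync ID and
# class names are illustrative, not taken from the patent.

SYNC_ID = 0x100          # message ID reserved for the sync broadcast

class Device:
    def __init__(self, name, task):
        self.name, self.task = name, task
        self.accept_ids = {SYNC_ID}       # acceptance-filter analogue

    def on_frame(self, can_id, data):
        if can_id in self.accept_ids:     # hardware filter would do this
            self.task(self.name, data)

class Bus:
    """Broadcast medium: every frame reaches every attached device."""
    def __init__(self):
        self.devices = []

    def send(self, can_id, data=b""):
        for dev in self.devices:
            dev.on_frame(can_id, data)

fired = []
bus = Bus()
for name in ("camera", "radar", "lidar"):
    bus.devices.append(Device(name, lambda n, d: fired.append(n)))

bus.send(0x2A0, b"\x01")   # unrelated traffic: filtered out everywhere
bus.send(SYNC_ID)          # sync frame: triggers all device tasks together
print(fired)               # ['camera', 'radar', 'lidar']
```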
VELOCITY REGRESSION SAFETY SYSTEM
Techniques for accurately predicting and avoiding collisions with objects detected in an environment of a vehicle are discussed herein. A vehicle safety system can implement a model to output data indicating a future intersection probability between the object and a portion of the vehicle. The model may employ a rear collision filter, a distance filter, and a time-to-stop filter to determine whether a predicted collision may be a false positive, in which case the techniques may include refraining from reporting such a predicted collision to a vehicle computing device that controls the vehicle.
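The three named filters can be read as a suppression chain over a model-predicted collision; the signals and thresholds below are illustrative assumptions, not the patented logic:

```python
def is_false_positive(longitudinal_offset_m: float,
                      gap_m: float,
                      time_to_collision_s: float,
                      time_to_stop_s: float,
                      min_gap_m: float = 0.3) -> bool:
    """Return True if a predicted collision should be suppressed.

    Filter names follow the abstract; the signals and thresholds are
    illustrative assumptions.
    """
    # Rear collision filter: the object sits behind the vehicle, so a
    # frontal-collision prediction is implausible.
    if longitudinal_offset_m < 0.0:
        return True
    # Distance filter: the predicted paths never come closer than a
    # minimum gap, so no contact actually occurs.
    if gap_m > min_gap_m:
        return True
    # Time-to-stop filter: the vehicle can come to rest before the
    # predicted contact time, so the collision will not happen.
    if time_to_stop_s < time_to_collision_s:
        return True
    return False

print(is_false_positive(5.0, 0.1, 2.0, 1.0))   # stops in time -> True
print(is_false_positive(5.0, 0.1, 1.0, 2.0))   # report it -> False
```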
VEHICLE CONTROL SYSTEM FOR DETECTING OBJECT AND METHOD THEREOF
A vehicle control system may include a controller that detects an object outside a vehicle, calculates an angle based on a ratio of a relative speed between the object and the vehicle to a speed of the vehicle, and updates a phase curve reflecting a phase distortion of an input signal based on the calculated angle.
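The abstract leaves the angle computation underspecified; one plausible reading, shown purely as an illustration, takes the arccosine of the clipped speed ratio, since a stationary object's radial relative speed equals the vehicle speed times the cosine of its bearing:

```python
import math

def phase_correction_angle(relative_speed_mps: float,
                           vehicle_speed_mps: float) -> float:
    """Angle (radians) from the ratio of relative speed to own speed.

    The abstract only says the angle is "based on" this ratio; arccos
    of the clipped ratio is one plausible reading, used here purely as
    an illustration of the computation, not as the patented method.
    """
    if vehicle_speed_mps == 0.0:
        return 0.0                       # undefined ratio: assume boresight
    ratio = relative_speed_mps / vehicle_speed_mps
    ratio = max(-1.0, min(1.0, ratio))   # clip into arccos's domain
    return math.acos(ratio)

# Stationary object dead ahead: relative speed equals own speed, angle 0.
print(phase_correction_angle(10.0, 10.0))
# Half the own speed corresponds to a 60-degree bearing (pi/3 rad).
print(phase_correction_angle(5.0, 10.0))
```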