Patent classifications
G01S17/93
EXTERNAL ENVIRONMENT SENSOR DATA PRIORITIZATION FOR AUTONOMOUS VEHICLE
An autonomous vehicle includes an array of sensors, a processor, and a switch. The array of sensors generates sensor data related to one or more objects in an external environment of the autonomous vehicle, and the processor determines an environmental context. The switch transfers the sensor data from the array of sensors to the processor, where the switch is configured to: (a) receive first sensor data from a first sensor group of the array of sensors; (b) receive second sensor data from a second sensor group of the array of sensors; (c) determine an order of transmission prioritizing the first sensor data over the second sensor data in response to the environmental context; and (d) transmit the first sensor data to the processor prior to transmitting the second sensor data, based on the order of transmission.
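The switch's ordering step (c) can be sketched as a small routine; a minimal sketch, assuming a context-to-priority mapping and sensor-group labels ("lidar", "radar", "camera") that are illustrative and not taken from the patent:

```python
# Hypothetical sketch of the switch's prioritization logic: given an
# environmental context, pick which sensor group's payload is sent first.
# The context names and group labels below are illustrative assumptions.

def order_of_transmission(context: str, groups: dict[str, bytes]) -> list[bytes]:
    """Return sensor payloads ordered so that the group favored by the
    environmental context is transmitted before the others."""
    # Assumed mapping: which sensor group takes priority in each context.
    priority = {"night": "lidar", "rain": "radar", "clear": "camera"}
    first = priority.get(context, "camera")
    ordered = [groups[first]]
    ordered += [data for name, data in groups.items() if name != first]
    return ordered
```

The processor then receives the first group's data before the rest, matching step (d) of the abstract.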
Emitter device for an optical detection apparatus, optical detection apparatus, motor vehicle and method
The invention relates to an emitter device (8) for an optical detection apparatus (3) of a motor vehicle (1), which is designed to scan a surrounding region (4) of the motor vehicle (1) by means of a light beam (10), and which comprises a light source (13) for emitting the light beam (10) and a deflection unit (15), wherein the deflection unit (15) is designed to deflect the light beam (10) emitted onto the deflection unit (15) by the light source (13) at different scanning angles (α), wherein the deflection unit (15) comprises a freeform mirror (19). The freeform mirror (19) comprises at least two surface elements (20a, 20b) having different angles of inclination (21a, 21b) and is designed to reflect the light beam (10) in order to generate a predetermined setpoint field of view (16) of the emitter device (8) at predetermined setpoint values (−α3, −α2, −α1, α0, +α1, +α2, +α3) for the scanning angle (α), said setpoint values corresponding to the angles of inclination (21a, 21b). The invention additionally relates to an optical detection apparatus (3), a motor vehicle (1) comprising at least one optical detection apparatus (3), and to a method for generating a setpoint field of view (16) for an emitter device (8) of an optical detection apparatus (3) of a motor vehicle (1).
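The correspondence between inclination angles and setpoint scanning angles can be sketched with the plane-mirror law: tilting a flat mirror element by an angle deflects the reflected ray by twice that angle. This is a simplified, small-angle sketch of the relationship the abstract describes, not the patent's actual optical design:

```python
# Illustrative sketch: a freeform mirror built from flat surface elements,
# each with its own inclination angle. By the plane-mirror law, a ray hitting
# element i is deflected by twice that element's inclination, so choosing the
# inclinations fixes the discrete setpoint scanning angles of the field of view.

def setpoint_scan_angles(inclinations_deg: list[float]) -> list[float]:
    """Scanning angles produced by surface elements with the given
    inclination angles (plane-mirror approximation, degrees)."""
    return [2.0 * a for a in inclinations_deg]
```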
System and method for assisting collaborative sensor calibration
Embodiments described herein include a method of receiving, by a moving assisting vehicle, a calibration assistance request related to a moving ego vehicle that requested assistance in collaborative calibration of a sensor deployed on the moving ego vehicle. The method further includes analyzing the calibration assistance request to extract at least one of a schedule or an assistance route associated with the requested assistance. The method includes communicating with the moving ego vehicle about a desired location, relative to the position of the moving ego vehicle, for the moving assisting vehicle to occupy in order to assist the sensor in acquiring information of a target present on the moving assisting vehicle. The method includes facilitating driving the moving assisting vehicle to reach the desired location to achieve the collaborative calibration of the sensor on the moving ego vehicle.
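The assisting vehicle's side of this exchange can be sketched as request handling followed by a target pose computation; the request fields ("schedule", "route") and the placeholder pose values are illustrative assumptions based only on the abstract:

```python
# Hypothetical sketch of the assisting vehicle's handling of a calibration
# assistance request: extract schedule/route, then report the desired pose
# relative to the ego vehicle so the ego sensor can see the calibration
# target mounted on the assisting vehicle. All field names are assumptions.

def handle_calibration_request(request: dict) -> dict:
    schedule = request.get("schedule")
    route = request.get("route")
    # Desired pose relative to the ego vehicle (placeholder values):
    # drive 10 m ahead, centered laterally, so the target is in view.
    desired_pose = {"ahead_m": 10.0, "lateral_m": 0.0}
    return {"schedule": schedule, "route": route, "pose": desired_pose}
```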
External environment sensor data prioritization for autonomous vehicle
Sensor data is received from an array of sensors configured to capture one or more objects in an external environment of an autonomous vehicle. A first sensor group is selected from the array of sensors based on proximity data or environmental contexts. First sensor data from the first sensor group is prioritized for transmission based on the proximity data or environmental contexts.
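Selection and prioritization by proximity can be sketched as follows; a minimal sketch assuming proximity data is a per-group nearest-object distance, with group names chosen for illustration:

```python
# Sketch of selecting the first sensor group by proximity data: the group
# whose detected object is closest is prioritized for transmission, and the
# remaining groups follow. Group names and the distance field are assumptions.

def select_first_group(proximity: dict[str, float]) -> str:
    """Pick the sensor group reporting the nearest object (in meters)."""
    return min(proximity, key=proximity.get)

def prioritized_order(proximity: dict[str, float]) -> list[str]:
    first = select_first_group(proximity)
    rest = sorted(g for g in proximity if g != first)
    return [first] + rest
```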
Object capturing device, capture target, and object capturing system
An object capturing device includes light emission, receiving, and scanning units, and distance calculation and object determination units. The scanning unit directs measurement light from the emission unit toward a measurement target space to perform scanning, and guides reflected light from the object with respect to the measurement light to the receiving unit. The distance calculation unit calculates a distance to the object in association with a scanning angle of the scanning unit. The object determination unit determines whether the object is a capture target based on whether a scanning angle range, within which a difference between distances is equal to or less than a predetermined threshold value, corresponds to a reference scanning angle range of the capture target, and on a determination of whether an intensity distribution of the reflected light within that scanning angle range corresponds to a reference intensity distribution of the reflected light from the capture target.
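The two-stage determination can be sketched in code; the threshold values, the run-length width test, and the mean-intensity comparison below are illustrative assumptions, simplified from the abstract's description:

```python
# Sketch of the capture-target test: (1) find the longest contiguous run of
# scan steps where successive distance differences stay below a threshold,
# and compare its width to the target's reference angular width; (2) check
# that the reflected-light intensity over the scan matches a reference level.
# Thresholds, tolerances, and the mean-intensity test are assumptions.

def is_capture_target(distances, intensities, ref_width, ref_intensity,
                      dist_thresh=0.05, width_tol=1, intensity_tol=0.1):
    # Stage 1: width (in scan steps) of the flat-distance run.
    run = 1
    best = 1
    for a, b in zip(distances, distances[1:]):
        run = run + 1 if abs(b - a) <= dist_thresh else 1
        best = max(best, run)
    if abs(best - ref_width) > width_tol:
        return False
    # Stage 2: mean reflected intensity close to the reference value.
    mean_i = sum(intensities) / len(intensities)
    return abs(mean_i - ref_intensity) <= intensity_tol
```

A flat, target-sized distance profile with matching intensity passes both stages; a sloped profile fails stage 1.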
ENVIRONMENTAL SENSING DEVICE AND INFORMATION ACQUIRING METHOD APPLIED TO ENVIRONMENTAL SENSING DEVICE
Disclosed are embodiments of environmental sensing devices and information acquiring methods applied to environmental sensing devices. In some embodiments, an environmental sensing device includes an integrated camera sensor and laser radar sensor, and a control unit. The control unit is connected simultaneously to the camera sensor and the laser radar sensor, and is used to send a trigger signal to both sensors at the same time. The design of integrating the camera sensor and the laser radar sensor avoids problems such as poor contact and noise generation that easily occur in a high-vibration, high-interference vehicle environment, and allows the camera sensor and the laser radar sensor to be triggered precisely and simultaneously, so as to obtain high-quality fused data and thereby improve the accuracy of environmental sensing. As a result, the camera sensor and the laser radar sensor have a consistent overlapping field of view.
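The simultaneous-trigger idea can be sketched with a shared event that releases both capture paths at once; the sensor stand-ins below are hypothetical, and a real device would use a hardware trigger line rather than software threads:

```python
import threading
import time

# Illustrative sketch of the control unit's simultaneous trigger: both
# sensor capture paths block on the same event, so a single set() releases
# the camera and the lidar at (nearly) the same instant.

trigger = threading.Event()
timestamps = {}

def sensor(name):
    trigger.wait()                    # both sensors armed on the same signal
    timestamps[name] = time.monotonic()

threads = [threading.Thread(target=sensor, args=(n,))
           for n in ("camera", "lidar")]
for t in threads:
    t.start()
trigger.set()                         # one trigger releases both sensors
for t in threads:
    t.join()
```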
REFRACTIVE BEAM STEERING DEVICE USEFUL FOR AUTOMATED VEHICLE LIDAR
An illustrative example device for steering a beam of radiation includes at least one compressible optic component comprising at least one lens and a compressible optic material adjacent to the lens. An actuator controls an orientation of the lens by selectively applying pressure on the compressible optic material.
LOW-PROFILE IMAGING SYSTEM WITH ENHANCED VIEWING ANGLES
Methods, devices, and systems of a light imaging and ranging system are provided. In particular, the imaging and ranging system includes a LIDAR sensor and a low-profile optics assembly having a reflective element with a continuous and uninterrupted reflective surface surrounding a periphery of the LIDAR sensor in a light path of the LIDAR sensor. The reflective element is positioned at a distance offset from the periphery of the LIDAR sensor and directs light emitted by the LIDAR sensor to a second reflective element that is substantially similar in shape and size to the first reflective element. The second reflective element is arranged above and opposite the first reflective element, directing the light emitted by the LIDAR sensor to a sensing environment outside the imaging and ranging system.