Patent classifications
G01S7/497
USER-IN-THE-LOOP OBJECT DETECTION AND CLASSIFICATION SYSTEMS AND METHODS
A detection device is adapted to traverse a search area and generate sensor data associated with an object that may be present in the search area, the detection device comprising a first logic device configured to detect and classify the object in the sensor data, communicate object detection information to a control system when the detection device is within communications range of the control system, and generate and store object analysis information for a user of the control system when the detection device is not in communication with the control system. The control system facilitates user monitoring and/or control of the detection device during operation and provides access to the stored object analysis information. The object analysis information is presented in an interactive display so that the user can detect and classify the detected object and thereby update the detection information, the trained object classifier, and the training dataset.
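The store-and-forward behavior described in this abstract can be sketched as follows. This is a minimal illustration, not the patented implementation; the class and method names (`DetectionDevice`, `handle`, `sync`) and the `Detection` fields are assumptions for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Illustrative object detection record."""
    object_id: int
    label: str
    confidence: float

class DetectionDevice:
    """Sketch of the described behavior: report detections immediately when in
    communications range of the control system, otherwise buffer object analysis
    information for later user review."""
    def __init__(self):
        self.buffer = []  # stored analysis info, awaiting communications
        self.sent = []    # detections already communicated to the control system

    def handle(self, detection, in_comms_range):
        if in_comms_range:
            self.sent.append(detection)    # communicate object detection info
        else:
            self.buffer.append(detection)  # store for later user analysis

    def sync(self):
        """Flush buffered analysis info once communications are restored."""
        flushed, self.buffer = self.buffer, []
        self.sent.extend(flushed)
        return flushed
```

The buffered records would feed the interactive display through which the user confirms or corrects classifications.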
IMAGE PROCESSING DEVICE, IMAGER, INFORMATION PROCESSING DEVICE, DETECTOR, ROADSIDE UNIT, IMAGE PROCESSING METHOD, AND CALIBRATION METHOD
An image processing device 10 includes an image interface 18, a memory 19, and a controller 20. The image interface 18 acquires a captured image. The positions of specific feature points in a world coordinate system and reference positions of the specific feature points are stored in the memory 19. The controller 20 detects the specific feature points in the captured image. When a discrepancy between the position in the captured image and the reference position is found for a predetermined percentage or more of the specific feature points, the controller 20 recalculates a calibration parameter.
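The recalibration trigger in this abstract amounts to counting how many feature points have drifted from their reference image positions. A minimal sketch, assuming pixel coordinates and a Euclidean deviation threshold (the threshold and trigger fraction values are illustrative, not from the patent):

```python
def needs_recalibration(detected, reference, threshold_px=2.0, trigger_fraction=0.5):
    """Return True when a predetermined percentage or more of the specific
    feature points deviate from their stored reference positions.

    detected, reference: dicts mapping feature-point id -> (u, v) pixel coords.
    """
    deviating = 0
    for pid, (u_ref, v_ref) in reference.items():
        if pid not in detected:
            continue  # feature point not found in this captured image
        u, v = detected[pid]
        if ((u - u_ref) ** 2 + (v - v_ref) ** 2) ** 0.5 > threshold_px:
            deviating += 1
    return deviating / len(reference) >= trigger_fraction
```

When this returns True, the controller would re-solve the calibration parameters from the known world-coordinate positions of the feature points.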
Absolute distance measurement for time-of-flight sensors
A time-of-flight (TOF) sensor device includes: an illumination component that emits a light beam toward a viewing space; a receiving lens element that receives reflected light and directs the reflected light to a photo-receiver array; and a processor. The processor is configured to generate distance information for a pixel corresponding to an object in the viewing space based on time-of-flight analysis of the reflected light; record a variation of an intensity of the reflected light from the object over time to yield intensity variation information; record a variation of the distance information for the pixel corresponding to the object over time to yield distance variation information; and apply a correction factor to the distance information in response to a determination that the intensity variation information and the distance variation information do not conform to an inverse-square relationship.
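The inverse-square check described here relies on reflected intensity scaling as 1/d² with distance, so the product I·d² should stay roughly constant as the object moves. A hedged sketch (tolerance value and the intensity-implied distance helper are illustrative assumptions, not the patent's correction factor):

```python
import math

def conforms_to_inverse_square(samples, tolerance=0.1):
    """Check whether (intensity, distance) samples recorded over time conform
    to an inverse-square relationship: I * d^2 should be roughly constant."""
    products = [i * d * d for i, d in samples]
    mean = sum(products) / len(products)
    return all(abs(p - mean) / mean <= tolerance for p in products)

def intensity_implied_distance(intensity, ref_intensity, ref_distance):
    """Distance implied by the inverse-square law relative to a trusted
    reference sample; a correction could weigh this against the raw TOF value."""
    return ref_distance * math.sqrt(ref_intensity / intensity)
```

If the recorded variations fail the check, the TOF distance is suspect (e.g. range aliasing), and a correction factor would be applied.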
Motorized Mounting Device for Positioning an Optical Element Within a Field-of-View of an Optical Sensor and Method of Use
A mounting device for selectively positioning an optical element within a field-of-view of an optical sensor of a vehicle includes: a housing defining an opening sized to fit over an aperture of the optical sensor; a holder for the optical element connected to the housing and positioned such that, when the holder is in a first position, the optical element is at least partially within the field-of-view of the optical sensor; and a motorized actuator. The motorized actuator can be configured to move the holder to adjust the position of the optical element relative to the field-of-view of the optical sensor.
LIDAR SYSTEM WITH PULSE-ENERGY MEASUREMENT
A system includes a light source, an optical splitter, and a pulse-energy measurement circuit. The light source is configured to generate an emitted beam of light that includes an emitted pulse of light. The optical splitter is configured to split the emitted beam of light to produce at least (i) a test beam of light that includes a test pulse of light, the test pulse of light including a first portion of the emitted pulse of light, and (ii) an output beam of light that includes an output pulse of light, the output pulse of light including a second portion of the emitted pulse of light that is allowed to at least partially exit the system. The pulse-energy measurement circuit is configured to receive the test pulse of light and determine a numerical value corresponding to an individual energy amount of the test pulse of light.
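Because the splitter routes a known fraction of each emitted pulse into the test path, the energy of the emitted pulse can be estimated by integrating the measured optical power of the test pulse and scaling by that fraction. A minimal sketch, assuming a photodiode read out through a load resistor and digitized at a fixed sample interval (all parameter names are illustrative assumptions):

```python
def pulse_energy_from_samples(voltage_samples, dt, responsivity, load_ohms, split_fraction):
    """Estimate emitted-pulse energy (J) from digitized photodetector samples
    of the test pulse.

    voltage_samples: detector output voltages (V) across the pulse duration
    dt:              sample interval (s)
    responsivity:    photodiode responsivity (A/W)
    load_ohms:       transimpedance / load resistance (ohms)
    split_fraction:  fraction of the emitted pulse routed to the test path
    """
    # photocurrent (A) = V / R; optical power (W) = I / responsivity
    test_energy = sum((v / load_ohms) / responsivity * dt for v in voltage_samples)
    return test_energy / split_fraction  # scale up to the full emitted pulse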
Lidar device
A LIDAR device including a housing and an emitter device that is rotatable about a rotation axis and is designed so that the measuring beams of the emitter device intersect in the area of an exit aperture of the LIDAR device.
AUTOMATIC EXTRINSIC CALIBRATION USING SENSED DATA AS A TARGET
Provided are systems and methods for auto-calibrating a vehicle using a calibration target that is generated from the vehicle's sensor data. In one example, the method may include receiving sensor data associated with a road captured by one or more sensors of a vehicle, identifying lane line data points within the sensor data, generating a representation which includes positions of a plurality of lane lines of the road based on the identified lane line data points, and adjusting a calibration parameter of a sensor from among the one or more sensors of the vehicle based on the representation of the plurality of lane lines.
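One way such a lane-line representation can yield an extrinsic correction: on a straight road, lane lines projected into the vehicle frame should run parallel to the longitudinal axis, so the mean fitted heading of the lines estimates the sensor's yaw misalignment. This is a hedged sketch of that idea, not the claimed method; the function name and the straight-road assumption are illustrative.

```python
import numpy as np

def yaw_offset_from_lane_lines(lane_points_vehicle):
    """Estimate a yaw calibration offset from identified lane-line points.

    lane_points_vehicle: list of (N, 2) arrays of (x, y) points per lane line,
    expressed in the vehicle frame (x forward, y left). Assumes straight lines.
    """
    headings = []
    for pts in lane_points_vehicle:
        x, y = pts[:, 0], pts[:, 1]
        slope, _intercept = np.polyfit(x, y, 1)  # fit y = slope * x + b
        headings.append(np.arctan(slope))        # heading of this lane line
    # Mean heading across lines is the estimated yaw misalignment (radians)
    return float(np.mean(headings))
```

The returned offset would then be applied to the sensor's extrinsic calibration parameter, with curvature and noise handling left to the full method.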