Patent classifications
G01S17/87
IMAGE PROCESSING DEVICE, IMAGER, INFORMATION PROCESSING DEVICE, DETECTOR, ROADSIDE UNIT, IMAGE PROCESSING METHOD, AND CALIBRATION METHOD
An image processing device 10 includes an image interface 18, a memory 19, and a controller 20. The image interface 18 acquires a captured image. The positions of specific feature points in a world coordinate system and reference positions of the specific feature points are stored in the memory 19. The controller 20 detects the specific feature points in the captured image. In a case where discrepancy between the position in the captured image and the reference position is found with regard to a predetermined percentage or more of the specific feature points, the controller 20 recalculates a calibration parameter.
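The recalibration trigger described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the pixel tolerance, and the 30% threshold are all assumptions standing in for the "predetermined percentage".

```python
import math

def needs_recalibration(detected, reference, tol_px=2.0, threshold=0.3):
    """Return True when the fraction of specific feature points whose
    detected image position deviates from its stored reference position
    by more than tol_px pixels reaches the threshold (here, 30%).
    detected/reference: parallel lists of (x, y) pixel coordinates."""
    discrepant = 0
    for (dx, dy), (rx, ry) in zip(detected, reference):
        if math.hypot(dx - rx, dy - ry) > tol_px:
            discrepant += 1
    return discrepant / len(reference) >= threshold
```

When this returns True, the controller would recompute the calibration parameter from the stored world-coordinate positions of the feature points.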
Absolute distance measurement for time-of-flight sensors
A time-of-flight (TOF) sensor device includes: an illumination component that emits a light beam toward a viewing space; a receiving lens element that receives reflected light and directs it to a photo-receiver array; and a processor. The processor is configured to generate distance information for a pixel corresponding to an object in the viewing space based on time-of-flight analysis of the reflected light; record the variation of the intensity of the reflected light from the object over time to yield intensity variation information; record the variation of the distance information for the pixel corresponding to the object over time to yield distance variation information; and apply a correction factor to the distance information in response to a determination that the intensity variation information and the distance variation information do not conform to an inverse-square relationship.
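The inverse-square consistency check above can be sketched as follows. Under an inverse-square model, received intensity I and distance d should satisfy I·d² ≈ constant; the function name and the 15% relative tolerance are illustrative assumptions.

```python
def tof_conforms_inverse_square(intensities, distances, rel_tol=0.15):
    """Check whether recorded intensity variation tracks recorded
    distance variation under I ∝ 1/d². Compares each sample's implied
    constant I*d^2 against the first sample's; a deviation beyond
    rel_tol suggests the distance reading needs a correction factor."""
    k0 = intensities[0] * distances[0] ** 2
    for i, d in zip(intensities, distances):
        if abs(i * d * d - k0) / k0 > rel_tol:
            return False
    return True
```

For example, intensity dropping from 100 to 25 while distance doubles is consistent with the model; intensity only halving over the same distance change is not, and would trigger the correction.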
CONTROL METHOD OF AN APPARATUS FOR ACTIVATING ONE OR MORE FUNCTIONALITIES OF THE SAME
A control method of an apparatus is provided. The apparatus includes a control unit coupled to a proximity sensor to detect a first distance of a user in a field of view, and coupled to a charge variation sensor to detect an electric/electrostatic charge variation caused by the user in a detection region. The control method includes acquiring a charge variation signal and generating charge variation parameters as a function of the charge variation signal. The control method further includes determining whether a condition on charge variation parameters is verified, and if the condition on charge variation parameters is verified, activating the proximity sensor and acquiring a proximity signal. Proximity parameters are generated as a function of the proximity signal. If a condition on proximity parameters is verified, one or more functionalities of the apparatus are activated.
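The two-stage gating described above, where the low-power charge-variation sensor decides whether the proximity sensor is even powered on, can be sketched as follows. The condition choices (peak-to-peak swing for the charge signal, a distance cutoff for proximity) and all thresholds are assumptions for illustration.

```python
def control_cycle(charge_samples, read_proximity_cm,
                  charge_threshold=5.0, wake_distance_cm=50.0):
    """Return True if a functionality of the apparatus should activate.
    charge_samples: recent charge-variation signal samples.
    read_proximity_cm: callable that powers up the proximity sensor on
    demand and returns the detected distance in centimeters."""
    # Stage 1: condition on charge-variation parameters
    # (here: peak-to-peak swing of the acquired signal).
    swing = max(charge_samples) - min(charge_samples)
    if swing < charge_threshold:
        return False  # proximity sensor is never activated; power saved
    # Stage 2: condition on proximity parameters.
    distance = read_proximity_cm()
    return distance <= wake_distance_cm
```

Keeping the proximity sensor off until stage 1 passes is the point of the scheme: the charge-variation sensor acts as an always-on, low-cost wake source.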
FOREIGN OBJECT DEBRIS DISCRIMINATION WITH MODULATED LASER LIGHT
A method of foreign object debris discrimination includes illuminating a particle located within a sensing volume with a modulated electromagnetic radiation pulse emitted from a source; receiving, at a detector, one or more electromagnetic radiation return signals that have been scattered by the illuminated particle; mixing, using a controller, the return signal of amplitude I_RS and frequency f_RS with a reference signal of amplitude I_LS and frequency f_LS; analyzing, using the controller, the amplitude of the mixed signal, √(I_LS·I_RS), and the frequency of the mixed signal, f_RS − f_LS; and classifying, using the controller, the particle's position, velocity, and electromagnetic characteristic based on the mixed signal's amplitude, √(I_LS·I_RS), and frequency, f_RS − f_LS.
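The mixing step above can be made concrete with a small numeric sketch: heterodyning the return against the local reference yields a beat component whose amplitude is the geometric mean term √(I_LS·I_RS) and whose frequency is the difference f_RS − f_LS. The Doppler-velocity helper is an added assumption (the beat is taken as purely Doppler, v = λ·Δf/2 for a round trip) and is not stated in the abstract.

```python
import math

def mix_signals(I_LS, f_LS, I_RS, f_RS):
    """Mix the return signal (amplitude I_RS, frequency f_RS) with the
    reference (amplitude I_LS, frequency f_LS); return the beat
    component's amplitude sqrt(I_LS * I_RS) and frequency f_RS - f_LS."""
    return math.sqrt(I_LS * I_RS), f_RS - f_LS

def doppler_velocity(beat_freq_hz, wavelength_m):
    """Radial velocity implied by a beat frequency, assuming the beat
    is entirely Doppler shift over a round trip: v = lambda * df / 2."""
    return wavelength_m * beat_freq_hz / 2.0
```

For instance, mixing a return of amplitude 9 at 1.5 MHz with a reference of amplitude 4 at 1.0 MHz gives a beat of amplitude 6 at 0.5 MHz.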
FINGERTIP LIDAR SYSTEM FOR VISUAL ASSISTANCE
Method and apparatus for using light detection and ranging (LiDAR) to assist the visually impaired. In some embodiments, a LiDAR system is affixed to a selected finger of a user and used to emit light pulses within a field of view (FoV) before the user. Reflected pulses are detected to generate a point cloud representation of the FoV. A sensory input is provided to the user that describes the point cloud representation of the FoV. The sensory input may be haptic, auditory or some other form. In some cases, a glove is worn by the user and a separate LiDAR system is affixed to each finger portion of the glove to provide a composite scanning and detection operation. Preconfigured hand gestures by the user can be used to change the operational configuration of the system.
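One plausible mapping from the point cloud to the haptic sensory input mentioned above is to drive vibration strength from the nearest detected point. This is purely an illustrative sketch; the mapping, range, and names are assumptions, not the patent's method.

```python
import math

def haptic_level(points, max_range_m=4.0):
    """Map a LiDAR point cloud (x, y, z in meters, sensor at origin)
    to a haptic drive level in [0, 1]. The nearest point dominates:
    closer obstacles give stronger vibration; an empty FoV, or nothing
    within max_range_m, gives 0."""
    if not points:
        return 0.0
    nearest = min(math.hypot(x, y, z) for x, y, z in points)
    if nearest >= max_range_m:
        return 0.0
    return 1.0 - nearest / max_range_m
```

With one such function per finger of the glove, each fingertip's LiDAR channel could drive its own haptic actuator independently.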
OPTIMIZED MULTICHANNEL OPTICAL SYSTEM FOR LIDAR SENSORS
The subject matter of this specification can be implemented in, among other things, systems and methods of optical sensing that utilize optimized processing of multiple sensing channels for efficient and reliable scanning of environments. The optical sensing includes multiple optical communication lines that include coupling portions configured to facilitate efficient collection of various received beams. The optical sensing system further includes multiple light detectors configured to process collected beams and produce data representative of a velocity of an object that generated the received beam and/or a distance to that object.
MOBILE SYSTEM AND METHOD OF SCANNING AN ENVIRONMENT
A system and method for measuring three-dimensional (3D) coordinate values of an environment is provided. The system includes a movable base unit, a first scanner, and a second scanner. One or more processors perform a method that includes causing the first scanner to determine a first plurality of coordinate values in a first frame of reference based at least in part on a measurement by at least one sensor. The second scanner determines a second plurality of 3D coordinate values in a second frame of reference as the base unit is moved from a first position to a second position. The first plurality of coordinate values and the second plurality of 3D coordinate values are determined simultaneously. The second plurality of 3D coordinate values is registered in a common frame of reference based on the first plurality of coordinate values.
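The registration step above can be sketched in simplified 2D form: if the first scanner's measurements yield the base unit's pose at each capture, points measured by the second scanner in its local frame are transformed into the common frame by a rotation plus translation. The pose parameterization and names are illustrative assumptions.

```python
import math

def register(points_local, pose):
    """Transform 2D points from the moving scanner's local frame into
    the common frame, given the scanner pose (x, y, theta) derived from
    the first scanner's measurements. Standard rigid-body transform:
    p_world = R(theta) @ p_local + t."""
    x0, y0, th = pose
    c, s = math.cos(th), math.sin(th)
    return [(x0 + c * px - s * py, y0 + s * px + c * py)
            for px, py in points_local]
```

Because both scanners measure simultaneously, each batch of second-scanner points has a pose available at the same instant, which is what makes the common-frame registration possible while the base unit is in motion.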
SITUATIONAL AWARENESS SYSTEM FOR AN AUTONOMOUS OR SEMI-AUTONOMOUS VEHICLE
A situational awareness system for a vehicle comprising a cyber-physical system. The situational awareness system is configured to generate an imaging dataset for processing by the cyber-physical system, enabling a semi-autonomous or autonomous operational mode of the vehicle. The situational awareness system includes a sensory system with a first electro-optical unit for imaging the surroundings of the vehicle, a second electro-optical unit for imaging a ground area in the direct vicinity of the vehicle, a radar unit for detecting objects, and a third electro-optical unit for object identification. The situational awareness system further includes a data synchronization system configured to synchronize the imaging data obtained by each unit of the sensory system and to provide the synchronized imaging dataset to the cyber-physical system of the vehicle.
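A common way to implement the data synchronization described above is nearest-timestamp matching across sensor streams; the sketch below is an assumption about one simple approach, not the patent's mechanism. Frames are (timestamp, data) pairs; one stream acts as the reference and matches from the other units are dropped when no frame falls within the tolerance.

```python
def synchronize(reference, others, tolerance_s=0.05):
    """Group timestamped frames from multiple sensor units.
    reference: list of (t, data) frames from the reference unit.
    others: list of such lists, one per remaining unit.
    Returns tuples (ref_frame, match_1, ...), keeping only reference
    frames for which every other unit has a frame within tolerance_s."""
    out = []
    for t, data in reference:
        matches = []
        for stream in others:
            best = min(stream, key=lambda f: abs(f[0] - t))
            if abs(best[0] - t) > tolerance_s:
                break  # this unit has no frame close enough in time
            matches.append(best)
        else:
            out.append(((t, data), *matches))
    return out
```

The synchronized tuples are what such a system would hand to the cyber-physical system, so that camera, ground-imaging, and radar data describe the same instant.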