Patent classifications
G01S7/4865
Systems and methods for collaborative location tracking and sharing using augmented reality
Disclosed is a location tracking system and associated methods for precisely locating a target device with a recipient device via different forms of location tracking and augmented reality. The recipient device receives a first position of the target device over a data network. The recipient device is moved according to the first position until the target device is in Ultra-WideBand (“UWB”) signaling range of the recipient device. The recipient device then measures a distance and direction of the target device relative to the recipient device based on Time-of-Flight (“ToF”) measurements generated from the UWB signaling. The recipient device determines a second position of the target device based on the distance and direction of the target device, and generates an augmented reality view with a visual reference at a particular position in images of a captured scene that corresponds to the second position of the target device.
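The refinement step described here, computing a second, more precise position from a UWB-derived range and bearing, can be illustrated with a short sketch. This is a minimal illustration assuming a local planar coordinate frame and a single ToF reading; the function names and the frame are assumptions, not the patent's implementation.

```python
import math

# Hedged sketch of the position refinement step described above.
# The planar coordinate frame and all names are assumptions.

C = 299_792_458.0  # speed of light, m/s

def uwb_distance(tof_seconds: float) -> float:
    """Convert a one-way UWB Time-of-Flight measurement to a distance."""
    return tof_seconds * C

def second_position(recipient_xy, distance_m, bearing_rad):
    """Place the target at the measured range and bearing from the recipient."""
    x, y = recipient_xy
    return (x + distance_m * math.cos(bearing_rad),
            y + distance_m * math.sin(bearing_rad))

# Example: a 4.2 m range at a 30-degree bearing from the recipient at the origin.
target = second_position((0.0, 0.0), 4.2, math.radians(30))
print(target)
```

The second position would then be projected into the camera image to place the augmented reality marker.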
DISTANCE IMAGE ACQUISITION APPARATUS AND DISTANCE IMAGE ACQUISITION METHOD
Disclosed are a distance image acquisition apparatus and a distance image acquisition method capable of achieving high distance measurement accuracy while avoiding wasteful imaging and calculation. The distance image acquisition apparatus (10) includes a distance image sensor (14), a drive mode setting unit (20A), a distance image generation unit (20B), a pulse light emission unit (22), and an exposure control unit (24). The exposure control unit (24) controls emission and non-emission of pulse light from the pulse light emission unit (22) according to the drive mode set by the drive mode setting unit (20A), and controls exposure in the distance image sensor (14). The distance image generation unit (20B) processes the sensor output acquired from the distance image sensor (14) according to the drive mode set by the drive mode setting unit (20A) to generate a distance image corresponding to the distance of a subject.
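The abstract does not detail the drive modes, but one common way pulsed ToF sensors turn exposure-window charges into distance is the two-window ratio method sketched below. This is a hedged illustration of that general calculation, with all names assumed.

```python
# Hedged sketch of one common pulsed indirect-ToF calculation: two exposure
# windows S0 and S1 straddle the returning pulse, and the fraction of charge
# landing in the delayed window encodes the round-trip delay. Illustrative
# only; not necessarily this apparatus's drive modes.

C = 299_792_458.0  # speed of light, m/s

def pulsed_tof_distance(s0: float, s1: float, pulse_width_s: float) -> float:
    """Distance from charges s0 (in-pulse window) and s1 (delayed window)."""
    total = s0 + s1
    if total == 0:
        raise ValueError("no returned light detected")
    delay = pulse_width_s * (s1 / total)  # round-trip delay within the pulse
    return C * delay / 2.0                # halve for the round trip

# Example: a 30 ns pulse with 40% of the charge in the delayed window.
print(pulsed_tof_distance(s0=0.6, s1=0.4, pulse_width_s=30e-9))  # ~1.8 m
```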
SYSTEMS AND METHODS FOR EFFICIENT MULTI-RETURN LIGHT DETECTORS
Described herein are systems and methods for efficiently detecting multi-return light signals. A light detection and ranging (LIDAR) system may fire a laser beam that hits multiple objects at different distances along a single line, causing multiple return light signals to be received by the system. Multi-return detectors may analyze the magnitudes of a plurality of peaks in the return signals and identify notable peaks, such as the first peak, the last peak, and the maximum peak. One embodiment for detecting the multi-return light signals is a multi-return recursive matched-filter detector, which comprises a matched filter, a peak detector, a centroid calculation, and a zeroing-out function. Other embodiments may be based on a maximum finder that algorithmically selects the highest-magnitude peaks from samples of the return signal, together with buffers for region-of-interest peaks.
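The recursive matched-filter detector enumerated here (matched filter, peak detector, centroid calculation, zeroing-out function) maps naturally onto a short loop. The sketch below is a minimal, assumed implementation of that pipeline; the threshold, window size, and Gaussian pulse template are illustrative choices.

```python
import numpy as np

# Hedged sketch of a multi-return recursive matched-filter detector:
# correlate with the pulse template, take the strongest peak, refine it
# with a centroid, zero out that region, and repeat.

def detect_returns(signal, template, max_returns=3, threshold=0.1):
    peaks = []
    residual = signal.astype(float).copy()
    half = len(template) // 2
    for _ in range(max_returns):
        filtered = np.correlate(residual, template, mode="same")
        idx = int(np.argmax(filtered))
        if filtered[idx] < threshold:
            break
        lo, hi = max(idx - half, 0), min(idx + half + 1, len(residual))
        window = residual[lo:hi]
        # Centroid refines the peak location to sub-sample precision.
        centroid = lo + float(np.sum(np.arange(len(window)) * window)
                              / np.sum(window))
        peaks.append(centroid)
        residual[lo:hi] = 0.0  # zero out so the next pass finds a new peak
    return sorted(peaks)       # first return ... last return

# Example: two returns, the second weaker, at samples ~42 and ~72.
t = np.exp(-0.5 * ((np.arange(9) - 4) / 1.5) ** 2)  # Gaussian pulse template
sig = np.zeros(128)
sig[38:47] += t
sig[68:77] += 0.6 * t
print(detect_returns(sig, t))
```

Sorting the detected centroids makes the first and last returns immediately available, and the maximum peak is simply the one found on the first pass.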
POSITRON EMISSION TOMOGRAPHY SYSTEM WITH A TIME SYNCHRONIZED NETWORK
A sensor network, which includes a sensor controller serially coupled to a plurality of sensor modules, is configured to program the sensor modules to transfer measurement data to the sensor controller and to synchronize the sensor modules to picosecond accuracy via on-chip or on-module custom circuits and a physical layer protocol. The sensor network has applications in PET, LiDAR, and FLIM. Synchronization to within picosecond accuracy is achieved through use of a picosecond time digitization circuit, which measures on-chip delays with high accuracy and precision. The delay measurements are directly comparable between separate chips even with voltage and temperature variations between chips.
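The abstract does not spell out the synchronization protocol, but the arithmetic such a controller-to-module exchange could rely on is the classic two-way time-transfer calculation, sketched below under the assumption of a symmetric link; the message scheme and variable names are hypothetical.

```python
# Hedged sketch of two-way time transfer between a sensor controller and a
# sensor module; an illustrative assumption, not this patent's protocol.

def offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """
    t1: controller send time     (controller clock)
    t2: module receive time      (module clock)
    t3: module reply time        (module clock)
    t4: controller receive time  (controller clock)
    Returns (module clock offset, one-way link delay), assuming the link
    delay is the same in both directions.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = ((t4 - t1) - (t3 - t2)) / 2.0
    return offset, delay

# Example in nanoseconds: module clock runs 5 ns ahead over a 2 ns link.
print(offset_and_delay(t1=0.0, t2=7.0, t3=9.0, t4=6.0))  # (5.0, 2.0)
```

Accurately measured on-chip delays, as provided by the picosecond time digitization circuit, are what make the timestamps in such an exchange comparable across chips.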
TIME-OF-FLIGHT IMAGING CIRCUITRY, TIME-OF-FLIGHT IMAGING SYSTEM, TIME-OF-FLIGHT IMAGING METHOD
The present disclosure generally pertains to time-of-flight imaging circuitry configured to: obtain first image data from an image sensor, the first image data being indicative of a scene illuminated with spotted light; determine a first image feature in the first image data; obtain second image data from the image sensor, the second image data being indicative of the scene; determine a second image feature in the second image data; estimate a motion of the second image feature with respect to the first image feature; and merge the first and the second image data based on the estimated motion.
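A minimal sketch of the estimate-motion-then-merge flow might look as follows, here using phase correlation as an assumed stand-in for the patent's feature matching; the pure-translation model and all names are illustrative.

```python
import numpy as np

# Hedged sketch of a motion-compensated merge: estimate a translation
# between two frames by phase correlation, shift the second frame back
# into alignment, and average.

def estimate_shift(frame_a, frame_b):
    """Return the integer (dy, dx) that aligns frame_b with frame_a."""
    fa, fb = np.fft.fft2(frame_a), np.fft.fft2(frame_b)
    cross = fa * np.conj(fb)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = frame_a.shape
    return (dy - h if dy > h // 2 else dy,
            dx - w if dx > w // 2 else dx)

def merge(frame_a, frame_b):
    dy, dx = estimate_shift(frame_a, frame_b)
    aligned = np.roll(frame_b, shift=(dy, dx), axis=(0, 1))
    return 0.5 * (frame_a + aligned)  # average after motion compensation

# Example: frame_b is frame_a circularly shifted; the estimate undoes it.
a = np.random.rand(64, 64)
b = np.roll(a, shift=(-3, 2), axis=(0, 1))
print(estimate_shift(a, b))  # expect (3, -2)
```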
EVENT-BASED COMPUTATIONAL PIXEL IMAGERS
A computational pixel imaging device that includes an array of pixel integrated circuits for event-based detection and imaging. Each pixel may include a digital counter that accumulates a digital number indicating whether a change is detected by the pixel. The counter may count in one direction for a portion of an exposure and in the opposite direction for another portion of the exposure. The imaging device may be configured to collect and transmit key frames at a lower rate, and delta or event frames at a higher rate. The key frames may include a full image of a scene captured by the pixel array. The delta frames may include sparse data captured by pixels that have detected meaningful changes in received light intensity. High-speed video with low transmission bandwidth can be reconstructed from the key frames and the delta frames.
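Reconstruction from key frames and delta frames can be sketched briefly. The event encoding below, a list of (row, col, signed_change) tuples per delta frame, is an assumption for illustration; the device's actual frame format is not given in the abstract.

```python
import numpy as np

# Hedged sketch of video reconstruction from a low-rate key frame plus
# high-rate sparse delta (event) frames, as described above.

def reconstruct(key_frame, delta_frames):
    """Yield one full frame per delta frame, starting from the key frame."""
    frame = key_frame.astype(np.int32).copy()
    for events in delta_frames:
        for row, col, change in events:
            frame[row, col] += change  # apply only the pixels that changed
        yield frame.copy()

# Example: a 4x4 key frame followed by two sparse delta frames.
key = np.zeros((4, 4), dtype=np.int32)
deltas = [[(1, 2, +5)], [(1, 2, -1), (3, 0, +7)]]
for f in reconstruct(key, deltas):
    print(f)
```

Because each delta frame carries only the changed pixels, the transmitted data stays sparse while the reconstructed video retains the delta-frame rate.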
TIME OF FLIGHT SENSOR RECORDED WITH COMPENSATION PARAMETERS
There is provided a time of flight sensor including a light source, a first pixel, a second pixel and a processor. The first pixel generates a first output signal without receiving reflected light from an external object illuminated by the light source. The second pixel generates a second output signal by receiving the reflected light from the external object illuminated by the light source. The processor calculates a deviation compensation and a deviation correction associated with temperature variation from the first output signal, and uses them to calibrate a distance calculated from the second output signal.
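The compensation idea, a shielded reference pixel that experiences the same temperature drift as the measuring pixel but no reflected light, can be sketched as a simple correction against a recorded baseline. The linear drift model and every name below are assumptions, not the recorded compensation parameters themselves.

```python
# Hedged sketch of temperature-drift compensation using a reference pixel:
# its output deviation from a factory-recorded baseline estimates the
# temperature-induced error in the measured distance. Linear model assumed.

def compensated_distance(raw_distance_m: float,
                         ref_output: float,
                         ref_baseline: float,
                         metres_per_unit_drift: float) -> float:
    """Subtract the distance error inferred from the reference pixel's
    drift relative to its recorded baseline."""
    drift = ref_output - ref_baseline
    return raw_distance_m - metres_per_unit_drift * drift

# Example: the reference pixel drifted +12 units; each unit maps to 1 mm.
print(compensated_distance(2.500, ref_output=112.0, ref_baseline=100.0,
                           metres_per_unit_drift=0.001))  # 2.488 m
```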