G01B11/22

Camera module and depth map extraction method thereof

A camera module according to one embodiment of the present invention comprises: an illumination unit for outputting an incident light signal irradiated onto an object; a lens unit for collecting a reflected light signal reflected from the object; an image sensor unit for generating an electric signal from the reflected light signal collected by the lens unit; a tilting unit for shifting an optical path of the reflected light signal; and an image control unit for extracting a depth map of the object by using a phase difference between the incident light signal and the reflected light signal received by the image sensor unit for a frame shifted by the tilting unit, wherein the lens unit is disposed on the image sensor unit and includes an infrared (IR) filter disposed on the image sensor unit and at least one lens disposed on the IR filter, and the tilting unit controls a tilt of the IR filter.
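The phase-difference depth extraction described in this abstract can be illustrated with a short sketch. This is not the patented implementation; it assumes the common four-phase indirect time-of-flight sampling scheme, and all function and variable names are hypothetical:

```python
import numpy as np

# Illustrative sketch of indirect time-of-flight depth extraction, assuming
# four per-pixel correlation samples of the reflected signal taken at
# 0/90/180/270 degree offsets relative to the incident (emitted) signal.
C = 299_792_458.0  # speed of light, m/s

def depth_from_phase(q0, q90, q180, q270, mod_freq_hz):
    """Estimate depth from the phase difference between the incident
    light signal and the reflected light signal."""
    # Phase difference recovered from the four correlation samples.
    phase = np.arctan2(q90 - q270, q0 - q180)
    phase = np.mod(phase, 2 * np.pi)  # wrap into [0, 2*pi)
    # The round-trip covers phase/(2*pi) of one modulation wavelength,
    # so one-way depth is c * phase / (4 * pi * f).
    return (C * phase) / (4 * np.pi * mod_freq_hz)

# Example: a pixel showing a quarter-cycle phase shift at 20 MHz modulation.
q0, q90, q180, q270 = 0.0, 1.0, 0.0, -1.0  # pure 90-degree shift
d = depth_from_phase(q0, q90, q180, q270, 20e6)  # about 1.87 m
```

At 20 MHz modulation the unambiguous range is c/(2f), roughly 7.5 m, so a quarter-cycle shift maps to roughly 1.87 m.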

Tire tread gauge using visual indicator
11566972 · 2023-01-31

An electronic battery tester for testing a storage battery includes a Kelvin connection configured to electrically couple to the storage battery and a microprocessor configured to determine a dynamic parameter of the storage battery. A forcing function source is configured to apply a forcing function signal to the storage battery through the Kelvin connection. A sensor is electrically coupled to the storage battery and configured to sense an electrical response of the storage battery to the applied forcing function signal. A tire tread gauge is arranged to be inserted into a tread of a tire. The tire tread gauge includes a visual indicator. An image capture device is configured to capture an image of the tire tread gauge when the tire tread gauge is inserted into the tread of the tire.
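One simple way a captured image of such a gauge could yield a tread depth is to relate the visible length of the indicator to how far the probe sank into the tread. This is purely an illustrative sketch under assumed geometry, not the patented method; every name and number below is hypothetical:

```python
# Illustrative sketch: infer tread depth from an image of the inserted gauge.
# Assumes the gauge's visual indicator is a band whose visible pixel length
# shrinks linearly as the probe sinks into the tread; the scale values are
# made up for the example.
def tread_depth_mm(visible_px, full_px, gauge_len_mm):
    """Map the visible indicator length (in pixels) to insertion depth."""
    inserted_frac = 1.0 - visible_px / full_px
    return inserted_frac * gauge_len_mm

# Example: 150 of 200 indicator pixels remain visible on a 20 mm gauge.
d = tread_depth_mm(visible_px=150, full_px=200, gauge_len_mm=20.0)  # 5.0 mm
```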

Determining features of a user's eye from depth mapping of the user's eye via indirect time of flight

An eye monitoring system is included in a headset of a virtual reality system or of an augmented reality system. The eye monitoring system determines distances between the eye monitoring system and portions of a user's eye enclosed by the headset. The eye monitoring system projects a temporally periodic pattern of light onto the user's eye, which is captured by a sensor. The eye monitoring system determines a distance between the eye monitoring system and locations of the user's eye based on a phase shift of the periodic pattern of light captured by each pixel of the sensor. From the determined distances, the eye monitoring system determines features of the user's eye.
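The final step, deriving eye features from per-pixel distances, could for instance involve fitting a sphere to the recovered 3D points to approximate the eyeball surface and its centre. The fit method and all names below are assumptions for illustration, not the patented technique:

```python
import numpy as np

# Illustrative sketch: fit a sphere to 3D points recovered from per-pixel
# depths, e.g. to approximate the eyeball surface and locate its centre.
def fit_sphere(points):
    """Algebraic least-squares sphere fit.

    |p - c|^2 = r^2 rearranges to the linear system
    2*p.c + (r^2 - |c|^2) = |p|^2 in the unknowns (c, d)."""
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre, d = sol[:3], sol[3]
    radius = np.sqrt(d + centre @ centre)
    return centre, radius

# Example: noiseless samples of a 12 mm sphere centred 30 mm along the axis.
rng = np.random.default_rng(1)
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.array([0.0, 0.0, 0.03]) + 0.012 * dirs
c, r = fit_sphere(pts)  # recovers centre (0, 0, 0.03) and radius 0.012
```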

Tyre tread depth and tyre condition determination

A method for assessing tyre tread depth and/or tyre condition by taking and analysing a camera image or images of a tyre using portable instrumentation.

Z-PLANE IDENTIFICATION AND BOX DIMENSIONING USING THREE-DIMENSIONAL TIME-OF-FLIGHT IMAGING
20230228883 · 2023-07-20

A sensor system that obtains and processes time-of-flight (TOF) data is provided. A TOF sensor obtains raw data describing various surfaces. A processor applies an averaging filter to the raw data to smooth it, increasing the signal-to-noise ratio (SNR) of flat surfaces represented in the raw data, performs a depth compute process on the filtered data to generate distance data, generates a point cloud based on the distance data, and identifies the Z-planes in the point cloud.
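The pipeline in this abstract can be sketched with a minimal example: box-average the raw depth data, treat the smoothed depths as Z values, and report depths where many points cluster as candidate Z-planes. This is an illustrative sketch only; the 3x3 kernel, the histogram approach, and all names are assumptions:

```python
import numpy as np

# Illustrative sketch of the described pipeline: averaging filter for SNR,
# then Z-plane identification by clustering depths.
def average_filter(depth, k=3):
    """Box-average each pixel over a k x k neighbourhood to raise the SNR
    of flat surfaces (simple edge-padded implementation)."""
    pad = k // 2
    padded = np.pad(depth, pad, mode="edge")
    out = np.zeros_like(depth, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + depth.shape[0], dx:dx + depth.shape[1]]
    return out / (k * k)

def find_z_planes(depth, bin_width=0.01, min_frac=0.05):
    """Histogram the depths; bins holding at least min_frac of all points
    are reported as candidate Z-planes (bin centres, in metres)."""
    z = depth.ravel()
    bins = np.arange(z.min(), z.max() + bin_width, bin_width)
    hist, edges = np.histogram(z, bins=bins)
    centres = (edges[:-1] + edges[1:]) / 2
    return centres[hist >= min_frac * z.size]

# Example: a noisy floor at 1.0 m with a box top at 0.7 m.
rng = np.random.default_rng(0)
depth = np.full((40, 40), 1.0) + rng.normal(0, 0.002, (40, 40))
depth[10:30, 10:30] = 0.7 + rng.normal(0, 0.002, (20, 20))
planes = find_z_planes(average_filter(depth))  # planes near 0.7 and 1.0
```

The difference between the two recovered Z-planes (about 0.3 m here) is what a box-dimensioning step would use as the box height.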

ASSISTED VEHICLE OPERATION WITH IMPROVED OBJECT DETECTION

A vehicle control system including a camera configured to capture image data depicting a field of view proximate the vehicle is disclosed. The vehicle control system further includes a plurality of light sources in connection with the vehicle and a controller. The controller is configured to activate the plurality of light sources in an alternating pattern and to capture, with the camera, light reflected from at least one object at times corresponding to the alternating pattern. In response to variations in the light impinging upon the at least one object from the alternating pattern, the controller is configured to identify a distance of the object.
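One way intensity variations under alternating illumination can encode distance is through the inverse-square falloff of light: if two lights at a known longitudinal offset illuminate the object in turn, the ratio of the reflected intensities determines the distance. This is an illustrative sketch under assumed geometry, not the patented method; the names and the on-axis setup are assumptions:

```python
import math

# Illustrative sketch: estimate object distance from intensity variation
# under alternating illumination. Assumes two lights on the viewing axis
# separated by a known offset s, and reflected intensity falling off with
# the inverse square of distance.
def distance_from_ratio(i_near, i_far, s):
    """Recover object distance z (from the far light) given the intensities
    captured while each light is active in turn.

    Inverse-square model: i_far = k / z**2, i_near = k / (z - s)**2,
    so sqrt(i_near / i_far) = z / (z - s)."""
    r = math.sqrt(i_near / i_far)
    return s * r / (r - 1.0)

# Example: object 2.0 m from the far light, lights offset by 0.5 m.
k = 1.0
z_true, s = 2.0, 0.5
i_far = k / z_true ** 2
i_near = k / (z_true - s) ** 2
z_est = distance_from_ratio(i_near, i_far, s)  # recovers 2.0 m
```

The unknown reflectance k cancels in the ratio, which is why alternating the lights (rather than measuring absolute brightness) makes the estimate possible.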
