G06V10/143

Detection system
09852519 · 2017-12-26

A detection system including a light source, an image sensor and a processor is provided. The light source is configured to illuminate an object. The image sensor is configured to output a picture. The processor is configured to generate an IR picture and a color picture according to the picture captured by the image sensor, identify a skin-color object in the color picture and determine an object image in the IR picture according to the skin-color object.
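The abstract's pipeline (segment a skin-color object in the color picture, then use it to isolate the object image in the IR picture) can be sketched as follows. The YCbCr chrominance thresholds are an assumption for illustration; the patent does not specify a skin-color model.

```python
import numpy as np

def skin_mask(color_picture: np.ndarray) -> np.ndarray:
    """Boolean mask of skin-colored pixels in an RGB picture.

    Uses an assumed YCbCr chrominance box; the abstract only says a
    skin-color object is identified, not how.
    """
    r, g, b = (color_picture[..., i].astype(float) for i in range(3))
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

def object_image(ir_picture: np.ndarray, color_picture: np.ndarray) -> np.ndarray:
    """Keep only IR pixels that coincide with the skin-color object."""
    mask = skin_mask(color_picture)
    return np.where(mask, ir_picture, 0)
```

A single sensor frame would first be split into the IR and color pictures (as the abstract describes) before these two steps run.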

Facial analysis for vehicle entertainment system metrics
09852355 · 2017-12-26

A vehicle entertainment system includes a video display unit, a camera, a communication interface, and a processor. The video display unit provides content to a user. The camera outputs a camera signal containing data representing the user's face. The communication interface communicates with a central content usage analysis computer. The processor processes the camera signal to identify facial features, compares the facial features to defined demographics rules, identifies user demographics based on the comparison of the facial features to the defined demographics rules, correlates the user demographics to a timeline of content consumed by the user through the video display unit to generate enhanced content usage metrics, and communicates the enhanced content usage metrics through the communication interface for delivery to the central content usage analysis computer.
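The processor's chain (features → rules → demographics → timeline correlation) can be sketched as below. The rule predicates, the `interocular_px` feature, and the output record layout are all illustrative assumptions; the abstract only says features are compared against "defined demographics rules".

```python
# Hypothetical demographics rules: each is (predicate, label).
DEMOGRAPHICS_RULES = [
    (lambda f: f["interocular_px"] > 60, "adult"),
    (lambda f: f["interocular_px"] <= 60, "child"),
]

def identify_demographics(features: dict) -> str:
    """Return the label of the first rule the facial features satisfy."""
    for predicate, label in DEMOGRAPHICS_RULES:
        if predicate(features):
            return label
    return "unknown"

def enhanced_metrics(features: dict, timeline: list) -> list:
    """Correlate the inferred demographic with the content timeline,
    producing the 'enhanced content usage metrics' the abstract names."""
    demo = identify_demographics(features)
    return [{"time": t, "content": c, "demographic": demo}
            for t, c in timeline]
```

The resulting records would then be serialized and sent over the communication interface to the central analysis computer.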

Automatic exposure and gain control for face authentication

This document describes techniques and systems that enable automatic exposure and gain control for face authentication. The techniques and systems include a user device initializing a gain for a near-infrared camera system using a default gain. The user device ascertains patch-mean statistics of one or more regions-of-interest of a most-recently captured image that was captured by the near-infrared camera system. The user device computes an update in the initialized gain to provide an updated gain that is usable to scale the one or more regions-of-interest toward a target mean-luminance value. The user device dampens the updated gain by using hysteresis. Then, the user device sets the initialized gain for the near-infrared camera system to the dampened updated gain.
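The control loop described (initialize gain, take patch-mean statistics of the regions of interest, compute a gain update toward a target mean luminance, dampen with hysteresis, apply) can be sketched in a few lines. The target mean, hysteresis factor, and gain limits are assumed constants; the abstract names the steps but not the values.

```python
def update_gain(current_gain, roi_patches, target_mean=110.0,
                hysteresis=0.6, gain_range=(1.0, 16.0)):
    """One iteration of the described automatic gain-control loop."""
    # Patch-mean statistics over the regions of interest.
    patch_means = [sum(p) / len(p) for p in roi_patches]
    observed = sum(patch_means) / len(patch_means)
    # Gain that would scale the ROIs toward the target mean luminance.
    updated = current_gain * (target_mean / max(observed, 1e-6))
    # Dampen with hysteresis: move only part way toward the new gain,
    # so the loop does not oscillate frame to frame.
    dampened = current_gain + hysteresis * (updated - current_gain)
    lo, hi = gain_range
    return min(max(dampened, lo), hi)
```

In use, the dampened result would be written back as the NIR camera's gain before the next capture, so each frame's statistics drive the following frame's exposure.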

Virtual vehicle generation by multi-spectrum scanning

A method and system for generating a three-dimensional representation of a vehicle to assess damage to the vehicle. A mobile device may capture multispectral scans of a vehicle from each of a plurality of cameras configured to scan the vehicle at a different wavelength of the electromagnetic spectrum. A virtual model of the vehicle may be generated from the multispectral scan of the vehicle, such that anomalous conditions or errors in individual wavelength data are omitted from model generation. A representation of the virtual model may be presented to the user via the display of the mobile device. The virtual model of the vehicle may further be analyzed to assess damage to the vehicle.
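The key idea of omitting anomalous individual-wavelength data from model generation can be sketched as a per-point fusion step. The median-based rejection rule and the `max_dev` threshold are assumptions; the abstract does not state how anomalies are detected.

```python
import statistics

def fuse_multispectral(point_samples, max_dev=0.1):
    """Fuse per-wavelength measurements of one surface point.

    Samples deviating from the median by more than max_dev (relative)
    are treated as anomalous wavelength data and omitted, as the
    abstract describes; the rest are averaged into the model value.
    """
    med = statistics.median(point_samples)
    kept = [s for s in point_samples
            if abs(s - med) <= max_dev * abs(med)]
    return sum(kept) / len(kept)
```

Running this per scanned point yields a virtual model in which a single camera's bad reading (glare at one wavelength, say) does not distort the surface.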

Iris image acquisition system

An iris image acquisition system comprises an image sensor comprising an array of pixels including pixels sensitive to NIR wavelengths; at least one NIR light source capable of selectively emitting light with different discrete NIR wavelengths; and a processor, operably connected to the image sensor and the at least one NIR light source, to acquire image information from the sensor under illumination at one of the different discrete NIR wavelengths. A lens assembly comprises a plurality of lens elements with a total track length no more than 4.7 mm, each lens element comprising a material with a refractive index inversely proportional to wavelength. The different discrete NIR wavelengths are matched with the refractive index of the material for the lens elements to balance axial image shift induced by a change in object distance with axial image shift due to change in illumination wavelength.

Combined biometrics capture system with ambient free IR
20170364736 · 2017-12-21

An apparatus for a combined camera system is described herein. The apparatus includes an adjustable infrared (IR) pass filter whose passband is electrically adjusted. The apparatus also includes a rolling shutter sensor. The adjustable filter, together with a global reset of the rolling shutter sensor, is used to implement a global shutter.

Systems and methods for using hyperspectral data to produce a unified three-dimensional scan that incorporates depth
11688087 · 2023-06-27

An encoder is disclosed that uses hyperspectral data to produce a unified three-dimensional (“3D”) scan that incorporates depth for various points, surfaces, and features within a scene. The encoder may scan a particular point of the scene using frequencies from different electromagnetic spectrum bands, may determine spectral properties of the particular point based on returns measured across a first set of bands, may measure a distance of the particular point using frequencies of another band that does not interfere with the spectral properties at each of the first set of bands, and may encode the spectral properties and the distance of the particular point in a single hyperspectral dataset. The spectral signature encoded within the dataset may be used to classify the particular point or generate a point cloud or other visualization that accurately represents the spectral properties and distances of the scanned points.
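The single hyperspectral dataset combining spectral properties and distance per point, and the classification against known signatures, might be laid out as below. The record fields, band names, and nearest-signature matching are illustrative assumptions; the patent describes the combined encoding, not a concrete layout.

```python
from dataclasses import dataclass, field

@dataclass
class HyperspectralPoint:
    """One scanned point: spectral returns plus depth in a single record."""
    x: float
    y: float
    distance: float  # measured in a band that does not interfere with the others
    spectral: dict = field(default_factory=dict)  # band name -> return intensity

def classify(point, signatures):
    """Match the point's spectral signature to the closest known material
    (squared-error distance over the shared bands)."""
    def dist(sig):
        return sum((point.spectral.get(band, 0.0) - v) ** 2
                   for band, v in sig.items())
    return min(signatures, key=lambda name: dist(signatures[name]))
```

A point cloud is then just a list of such records, each carrying both geometry (x, y, distance) and the spectral signature used for classification or visualization.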
