H04N23/16

COMPUTER-VISION-BASED OBJECT TRACKING AND GUIDANCE MODULE

An apparatus comprises a mount body by which the apparatus is secured to a structure. A camera assembly, fixed to the mount body, includes an image sensor adapted to capture images within its field of view. A lighting assembly, rotatably connected to the mount body, houses one or more light sources including a directional light source. A control-board assembly, fixed to the mount body, houses control boards including one or more processors configured to acquire information about an object, to associate a location within the field of view of the image sensor with the object, to point light emitted by the directional light source at the location associated with the object by rotating the lighting assembly and turning the directional light source, and, based on an image acquired from the camera assembly, to detect change within the field of view of the image sensor corresponding to placement or removal of the object.
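
The pointing step, turning an image location into rotation angles for the lighting assembly, can be sketched with a pinhole-camera model. The function name, the boresight assumption (light source co-located with the camera), and all parameter values below are hypothetical, not from the abstract:

```python
import math

def pixel_to_pan_tilt(u, v, cx, cy, focal_px):
    """Convert a pixel location (u, v) to pan/tilt angles (radians)
    for a directional light boresighted with the camera.

    cx, cy   -- principal point (image centre) in pixels
    focal_px -- camera focal length expressed in pixels
    """
    pan = math.atan2(u - cx, focal_px)   # rotate lighting assembly left/right
    tilt = math.atan2(v - cy, focal_px)  # turn light source up/down
    return pan, tilt

# A location at the image centre needs no rotation:
print(pixel_to_pan_tilt(320, 240, 320, 240, 800))  # (0.0, 0.0)
```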

THIN DUAL-APERTURE ZOOM DIGITAL CAMERA
20200209645 · 2020-07-02

A dual-aperture zoom camera comprising a Wide camera with a respective Wide lens and a Tele camera with a respective Tele lens, the Wide and Tele cameras mounted directly on a single printed circuit board, wherein the Wide and Tele lenses have respective effective focal lengths EFL_W and EFL_T and respective total track lengths TTL_W and TTL_T, and wherein TTL_W/EFL_W > 1.1 and TTL_T/EFL_T < 1.0. Optionally, the dual-aperture zoom camera may further comprise an OIS (optical image stabilization) controller configured to provide a compensation lens movement (LMV) according to a user-defined zoom factor (ZF) and a camera tilt (CT) through LMV = CT * EFL_ZF, where EFL_ZF is a zoom-factor-dependent effective focal length.
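
The stabilization relation LMV = CT * EFL_ZF can be checked numerically; the tilt and focal-length values below are illustrative, not taken from the patent:

```python
def compensation_lens_movement(camera_tilt_rad, efl_zf_mm):
    """Compensation lens movement per the abstract's relation
    LMV = CT * EFL_ZF (a small-angle sketch; units are illustrative)."""
    return camera_tilt_rad * efl_zf_mm

# 2 mrad of camera tilt with a 7 mm zoom-dependent effective focal length:
print(compensation_lens_movement(0.002, 7.0))  # ~0.014 (mm)
```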

Capturing and processing of images including occlusions focused on an image sensor by a lens stack array

Systems and methods are disclosed for implementing array cameras configured to perform super-resolution processing, generating higher-resolution super-resolved images from a plurality of captured images, together with lens stack arrays that can be utilized in such array cameras. An imaging device in accordance with one embodiment of the invention includes at least one imager array, where each imager in the array comprises a plurality of light sensing elements and a lens stack including at least one lens surface configured to form an image on the light sensing elements; control circuitry configured to capture the images formed on the light sensing elements of each of the imagers; and a super-resolution processing module configured to generate at least one higher-resolution super-resolved image using a plurality of the captured images.
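
Super-resolution from a plurality of captured images can be sketched with a minimal shift-and-add scheme. The known sub-pixel shifts and nearest-neighbour placement below are simplifying assumptions for illustration, not the patent's processing pipeline:

```python
import numpy as np

def shift_and_add(images, shifts, scale=2):
    """Minimal shift-and-add super-resolution sketch.

    images -- list of HxW low-resolution arrays
    shifts -- per-image (dy, dx) sub-pixel shifts in low-res pixel units
    scale  -- upsampling factor of the high-resolution grid
    """
    h, w = images[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for img, (dy, dx) in zip(images, shifts):
        # Place each low-res sample at its shifted position on the fine grid.
        yi = np.round((np.arange(h)[:, None] + dy) * scale).astype(int)
        xi = np.round((np.arange(w)[None, :] + dx) * scale).astype(int)
        yi = np.clip(np.broadcast_to(yi, (h, w)), 0, h * scale - 1)
        xi = np.clip(np.broadcast_to(xi, (h, w)), 0, w * scale - 1)
        np.add.at(acc, (yi, xi), img)
        np.add.at(cnt, (yi, xi), 1)
    # Average samples that landed on the same high-res pixel.
    return np.divide(acc, cnt, out=acc, where=cnt > 0)

# Two identical frames with a half-pixel shift fill an upsampled grid:
hr = shift_and_add([np.ones((2, 2))] * 2, [(0.0, 0.0), (0.5, 0.5)])
```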

Digital cameras with direct luminance and chrominance detection

Digital camera systems and methods are described that provide a color digital camera with direct luminance detection. The luminance signals are obtained directly from a broadband image sensor channel without interpolation of RGB data. The chrominance signals are obtained from one or more additional image sensor channels comprising red and/or blue color band detection capability. The red and blue signals are directly combined with the luminance image sensor channel signals. The digital camera generates and outputs an image in YCrCb color space by directly combining outputs of the broadband, red and blue sensors.
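
The direct combination can be sketched as forming difference chrominance from the broadband and red/blue channel outputs. The simple differences below omit the offsets and scale factors of a standardized YCrCb encoding; they illustrate the idea only:

```python
def ycrcb_from_direct_channels(y, r, b):
    """Form YCrCb directly from a broadband (luminance) sensor output
    and red/blue sensor outputs, with no interpolation of RGB data."""
    cr = r - y  # red-difference chrominance
    cb = b - y  # blue-difference chrominance
    return y, cr, cb

print(ycrcb_from_direct_channels(0.5, 0.75, 0.25))  # (0.5, 0.25, -0.25)
```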

Systems and methods for lensed and lensless optical sensing of binary scenes
10653313 · 2020-05-19

A sensing device with an odd-symmetry grating projects near-field spatial modulations onto an array of closely spaced pixels. Due to physical properties of the grating, the spatial modulations are in focus for a range of wavelengths and spacings. The spatial modulations are captured by the array, and photographs and other image information can be extracted from the resultant data. Pixels responsive to infrared light can be used to make thermal imaging devices and other types of thermal sensors. Some sensors are well adapted for tracking eye movements, and others for imaging barcodes and similar binary images. In the latter case, the known binary property of the expected images can be used to simplify the process of extracting image data.
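
One way the binary prior simplifies extraction is to project a linear reconstruction back onto two levels; a minimal thresholding sketch (not the patent's exact algorithm):

```python
import numpy as np

def binarize_estimate(recovered, threshold=None):
    """Exploit the binary-scene prior: snap a continuous reconstruction
    to {0, 1}. Threshold defaults to the midpoint of the value range."""
    recovered = np.asarray(recovered, dtype=float)
    if threshold is None:
        threshold = (recovered.min() + recovered.max()) / 2
    return (recovered >= threshold).astype(int)

print(binarize_estimate([0.1, 0.9, 0.4, 0.8]))  # [0 1 0 1]
```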

Imaging device and imaging method for capturing a visible image and a near-infrared image
10659703 · 2020-05-19

An illumination unit radiates visible light or near-infrared light. A lens images the light from a subject, and a beam splitter separates the visible light and the near-infrared light. A color imaging device, which includes an imaging element having a red filter, images the reflected light from the subject illuminated with the visible light. A black-and-white imaging device images the near-infrared light separated by the beam splitter. The pixel pitch of the black-and-white imaging device that images the near-infrared light is larger than the pixel pitch of the color imaging device. The sampling position for the near-infrared light is displaced horizontally or vertically in the pixel arrangement with respect to the sampling position for red in the color image.

IMAGING APPARATUS, IMAGING METHOD, AND COMPUTER READABLE RECORDING MEDIUM
20200120314 · 2020-04-16

An imaging apparatus includes: an imaging sensor including a plurality of pixels; a color filter including a plurality of filters arranged to correspond to the pixels; a first light source configured to irradiate a subject with visible light; a second light source configured to irradiate the subject with near-infrared light; a first processor including hardware, the first processor being configured to control an irradiation timing of each of the first light source and the second light source; and a second processor including hardware, the second processor being configured to generate a plurality of pieces of near-infrared image data for different near-infrared regions based on first image data and second image data, the first image data being generated by the imaging sensor by capturing an image of the subject irradiated by the first light source, the second image data being generated by the imaging sensor by capturing an image of the subject irradiated by the second light source.
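
The generation step can be illustrated with a toy differencing scheme. The abstract does not specify the computation, so the frame roles and the subtraction below are assumptions: one frame is taken under visible illumination only and the other with the near-infrared source also on:

```python
def nir_image_from_frames(frame_visible_only, frame_visible_plus_nir):
    """Hypothetical separation step: if one frame is captured while only
    the first (visible) light source is on and another while the second
    (near-infrared) source is also on, the per-pixel difference isolates
    the near-infrared contribution."""
    return [m - v for m, v in zip(frame_visible_plus_nir, frame_visible_only)]

print(nir_image_from_frames([10, 20], [15, 26]))  # [5, 6]
```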

Thin dual-aperture zoom digital camera
10620450 · 2020-04-14

A dual-aperture zoom camera comprising a Wide camera with a respective Wide lens and a Tele camera with a respective Tele lens, the Wide and Tele cameras mounted directly on a single printed circuit board, wherein the Wide and Tele lenses have respective effective focal lengths EFL_W and EFL_T and respective total track lengths TTL_W and TTL_T, and wherein TTL_W/EFL_W > 1.1 and TTL_T/EFL_T < 1.0. Optionally, the dual-aperture zoom camera may further comprise an OIS (optical image stabilization) controller configured to provide a compensation lens movement (LMV) according to a user-defined zoom factor (ZF) and a camera tilt (CT) through LMV = CT * EFL_ZF, where EFL_ZF is a zoom-factor-dependent effective focal length.

HD color imaging using monochromatic CMOS image sensors integrated in 3D package

HD color video using monochromatic CMOS image sensors integrated in a 3D package is provided. An example 3DIC package for color video includes a beam splitter to partition received light of an image stream into multiple light outputs. Multiple monochromatic CMOS image sensors are each coupled to one of the multiple light outputs to sense a monochromatic image stream at a respective component wavelength of the received light. Each monochromatic CMOS image sensor is specially constructed, doped, controlled, and tuned to its respective wavelength of light. A parallel processing integrator or interposer chip heterogeneously combines the respective monochromatic image streams into a full-spectrum color video stream, including parallel processing of an infrared or ultraviolet stream. The parallel processing of the monochromatic image streams provides reconstruction to HD or 4K HD color video at low light levels. Parallel processing to one interposer chip also enhances speed, spatial resolution, sensitivity, low light performance, and color reconstruction.
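
The combining step, merging per-sensor monochromatic frames into one color frame, amounts to stacking the streams as channels. A minimal software sketch only: the patent performs this in parallel on an interposer chip, and frame registration between sensors is assumed already done:

```python
import numpy as np

def combine_streams(red, green, blue):
    """Merge per-sensor monochromatic frames into a single color frame
    by stacking them as the last (channel) axis."""
    return np.stack([red, green, blue], axis=-1)

# Three 2x2 monochromatic frames become one 2x2 three-channel frame:
frame = combine_streams(np.full((2, 2), 10),
                        np.full((2, 2), 20),
                        np.full((2, 2), 30))
print(frame.shape)  # (2, 2, 3)
```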

Multi-aperture camera system having auto focusing function and/or depth estimation function

A multi-aperture camera system having an auto focusing function comprises: multiple apertures; an image sensor that creates multiple images by respectively processing light signals introduced through the multiple apertures; and an auto focusing unit that uses the multiple images to determine a distance by which the image sensor moves relative to the multiple apertures for auto focusing, wherein the central position of at least one of the multiple apertures is misaligned with the central positions of the remaining apertures.
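
The abstract does not give the focusing math; a common way to use images formed through laterally offset apertures is the stereo relation Z = f * B / d. A minimal sketch with hypothetical names and illustrative values:

```python
def subject_distance_from_disparity(disparity_px, pixel_pitch_mm,
                                    baseline_mm, focal_mm):
    """Hypothetical disparity-to-distance step for auto focusing:
    with aperture centres offset by baseline_mm, the subject distance
    follows the stereo relation Z = f * B / d (thin-lens sketch)."""
    d_mm = disparity_px * pixel_pitch_mm  # disparity measured on the sensor
    return focal_mm * baseline_mm / d_mm

# 10 px disparity, 1 um pixel pitch, 0.2 mm baseline, 4 mm focal length:
print(subject_distance_from_disparity(10, 0.001, 0.2, 4.0))  # ~80 mm
```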