Patent classifications
H04N9/07
Multi-Camera Imaging System for Nanosatellites
A satellite imaging system uses multiple cameras. For example, incoming light from a telescope section of the satellite passes through a dichroic beam splitter, with wavelengths in the standard visible spectrum directed to a first camera and wavelengths outside the standard visible spectrum, such as in the infrared or coastal-blue range, directed to a second camera, allowing image data from multiple wavelength ranges to be captured simultaneously. The image data from the different wavelengths of the two cameras can then be selectively recombined. More generally, the system operates on a first range of wavelengths and a second range of wavelengths.
CMOS sensor with standard photosites
An image sensor having photosites forming an array (K×L) of K rows and L columns, including a first set of integrator circuits, with a first regulation by analog weighting in blocks of n×n′ photosites, said photosites belonging to n adjacent columns and to n′ adjacent rows, and a second set of integrator circuits, with a second regulation by analog weighting in blocks of m×m′ photosites, said photosites belonging to m adjacent columns and to m′ adjacent rows, n adjacent columns of a first set of columns of the array being connected to n×n′ integrator circuits of the first set, m adjacent columns of a second set of columns of the array being connected to m×m′ integrator circuits of the second set, n columns of the first set alternating with m columns of the second set to form the array of photosites.
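The analog weighting described in this claim can be pictured digitally: each integrator circuit accumulates a weighted sum over a block of photosites. The sketch below is a hypothetical digital analogue, not the claimed analog circuit; the uniform `weight` parameter and the tiling into non-overlapping blocks are assumptions for illustration.

```python
# Hypothetical digital analogue of the claimed analog block weighting:
# photosite values in an n x n' block are combined into one weighted sum,
# mimicking what a single integrator circuit would accumulate.

def bin_block(array, row0, col0, n_rows, n_cols, weight=1.0):
    """Weighted sum over an n_rows x n_cols block of photosites."""
    total = 0.0
    for r in range(row0, row0 + n_rows):
        for c in range(col0, col0 + n_cols):
            total += weight * array[r][c]
    return total

def bin_array(array, n_rows, n_cols, weight=1.0):
    """Tile the K x L photosite array into blocks and integrate each."""
    K, L = len(array), len(array[0])
    return [[bin_block(array, r, c, n_rows, n_cols, weight)
             for c in range(0, L, n_cols)]
            for r in range(0, K, n_rows)]
```

For example, binning a 4x4 array of ones into 2x2 blocks yields a 2x2 array in which each entry is the sum 4.0.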
Image processing device and image processing method
An image processing device includes a processor including hardware. The processor is configured to implement an image acquisition process that acquires an image captured by an image sensor that includes a first-color filter having first transmittance characteristics, a second-color filter having second transmittance characteristics, and a third-color filter having third transmittance characteristics, and a correction process that multiplies a pixel value that corresponds to a second color by a first coefficient to calculate a component value that corresponds to the overlapping region of the first transmittance characteristics and the third transmittance characteristics, and corrects a pixel value that corresponds to a first color, the pixel value that corresponds to the second color, and a pixel value that corresponds to a third color based on the component value that corresponds to the overlapping region. The second color is longer in wavelength than the first color and shorter in wavelength than the third color.
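Since the second color is between the first and third in wavelength, a natural reading is first = blue, second = green, third = red. A minimal sketch of the described correction follows, assuming that reading; the coefficient `k1` is a sensor-specific value the abstract leaves unspecified, and subtracting the overlap component from all three channels is an illustrative choice, not the patent's exact correction.

```python
# Hypothetical sketch of the described correction. First, second and
# third colors are taken as blue, green and red (shortest to longest
# wavelength); k1 is an assumed, sensor-specific coefficient.

def correct_pixel(b, g, r, k1):
    """Estimate the blue/red transmittance-overlap component from the
    green pixel value and subtract it from all three channels."""
    overlap = k1 * g               # component of the overlapping region
    return (b - overlap, g - overlap, r - overlap)
```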
PROJECTION DEVICE
A projection device includes a projection module and a first camera module. The projection device has a first optical axis and is configured to form a projection area, wherein a projection of the first optical axis on an X-Z plane of the projection device is perpendicular to an X-Y plane on which the projection area is formed. The first camera module is disposed on a side of the projection module and includes a second optical axis, wherein the first camera module is configured to form a first shooting area, the second optical axis forms a first angle Δθ1 with respect to the first optical axis, the projection area at least partially overlaps the first shooting area to form an overlapping area, and the first angle Δθ1 is a function of a distance between the projection module and the first camera module.
IMAGE ACQUISITION APPARATUS
An image acquisition apparatus including: a stage on which a specimen is mounted; an objective lens that collects light from the specimen; a stage driving part that drives the stage; an image capturing unit that acquires an image by photographing the light collected by the objective lens; an image-generating unit that generates a pasted image by pasting the captured image acquired by the image capturing unit; a storage unit that stores the pasted image; a color-difference calculating unit that calculates a degree of dissimilarity between a color of the pasted image and a color of the captured image prior to pasting; and a color-difference correcting unit that corrects the color of the captured image so as to match the color of the pasted image, on the basis of the calculated degree of dissimilarity.
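One simple way to realize the color-difference calculation and correction above is to take the per-channel mean difference between the stored pasted image and the newly captured tile as the degree of dissimilarity, and shift the tile by that difference. This is an illustrative sketch under that assumption, not the patented algorithm itself.

```python
# Illustrative sketch (assumed model, not the patent's exact method):
# the degree of dissimilarity is the per-channel mean difference between
# the pasted image and the captured tile; the tile is shifted to match.

def mean_color(pixels):
    """Per-channel mean of a list of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def match_colors(pasted_pixels, captured_pixels):
    """Correct captured tile colors toward the stored pasted image."""
    target = mean_color(pasted_pixels)
    source = mean_color(captured_pixels)
    diff = tuple(t - s for t, s in zip(target, source))  # dissimilarity
    return [tuple(v + d for v, d in zip(p, diff)) for p in captured_pixels]
```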
SYSTEMS AND METHODS OF AUGMENTED REALITY GUIDED IMAGE CAPTURE
Some embodiments provide a mobile device configured to guide a user, via an augmented reality (AR) interface generated by the mobile device, to capture images of a physical object using a camera of the mobile device. The mobile device may be configured to obtain boundary information indicative of a boundary enclosing the physical object (e.g., a box enclosing the physical object). The mobile device may be configured to use the boundary information to determine positions from which the user is to capture images of the physical object. The mobile device may be configured to guide the user to capture the images using the AR interface by guiding the user to each of the positions in the AR interface (e.g., by generating one or more GUI elements in the AR interface that indicate a position from which the user is to capture an image).
Image-capturing device, image-processing device, image-processing method, and image-processing program
An image-capturing device includes: an image-capturing element in which a plurality of pixels, which have different spectral sensitivities, are arrayed in a two-dimensional matrix, and phase-difference detection pixels are arranged as some of the pixels; a phase-difference pixel discriminating unit that classifies the phase-difference detection pixels arranged in the image-capturing element as first pixels, which have a spectral sensitivity at which degradation of image quality is harder for the human eye to discern than at the other spectral sensitivities, and second pixels, which are the phase-difference detection pixels other than the first pixels; and a phase-difference pixel value correcting unit that subjects the first pixels classified by the phase-difference pixel discriminating unit to correction processing of a lower precision than that for the second pixels.
SIGNAL PROCESSING DEVICE AND SIGNAL PROCESSING METHOD
[Object] To make it possible to improve reading efficiency of pixel signals in a signal processing device in which rows of pixels having different pixel arrays are arranged at intervals of one line.
[Solution] Provided is a signal processing device, including: a pixel array unit configured to include first pixels, second pixels, third pixels, and fourth pixels which have different spectral sensitivity characteristics and are arranged in a matrix form; and a pixel signal reading unit configured to read pixel signals obtained from the plurality of pixels arranged in the pixel array unit. The first pixels are adjacent to the second pixels in a row direction and a column direction, the second pixels are arranged at a two-pixel pitch in the row direction and the column direction, the third pixels are adjacent to the second pixels in one diagonal direction, the fourth pixels are adjacent to the second pixels in the other diagonal direction, and the pixel signal reading unit adds and reads the pixel signals obtained from the plurality of first pixels.
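Reading the claim geometrically, the second pixels sit at a two-pixel pitch, the two first pixels occupy the positions adjacent to them in the row and column directions within each 2x2 tile, and the remaining corner holds a third or fourth pixel. The sketch below assumes that tile layout (an interpretation, not stated coordinates) and shows the addition readout of the two first-pixel signals.

```python
# Hypothetical 2x2 tile layout inferred from the claim:
#   tile[0][0] = second pixel, tile[0][1] and tile[1][0] = first pixels,
#   tile[1][1] = third or fourth pixel. Readout adds the first pixels.

def read_tile(tile):
    """tile: 2x2 list [[second, first_a], [first_b, other]].
    Returns (second, summed first-pixel read, other)."""
    second = tile[0][0]
    first_sum = tile[0][1] + tile[1][0]   # added read of the first pixels
    other = tile[1][1]
    return second, first_sum, other
```

Adding the two first-pixel signals in one read, rather than reading each individually, is what improves the reading efficiency stated in the object.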
IMAGING ELEMENT AND IMAGING APPARATUS
Manufacture is simplified for an imaging element in which light that enters a pixel without passing through the color filter arranged in the pixel is attenuated. An imaging element includes a pixel and an incident light attenuating section. The pixel includes a color filter through which light of a predetermined wavelength from a subject is transmitted, and a photoelectric conversion section generating charges responding to the light transmitted through the color filter. The incident light attenuating section is arranged between the subject and the color filter, and attenuates the light entering the photoelectric conversion section without being transmitted through the color filter arranged in the pixel.
IMAGE PROCESSING DEVICE, IMAGING DEVICE, AND IMAGING METHOD
An image signal corresponding to infrared light is separated from an image signal corresponding to visible light, even when imaging is performed during infrared light irradiation and the captured signal includes both components. An exposure control unit alternately repeats a first frame in which an exposure time is set to a predetermined first exposure time and a second frame in which the exposure time is set to a second exposure time longer than the first exposure time. An infrared light irradiation control unit performs irradiation of infrared light in a predetermined infrared light irradiation period which is equal to or less than an aggregate period of the first frame and the second frame. An image signal acquisition unit acquires a first image signal which is the image signal in the first frame and a second image signal which is the image signal in the second frame. An extraction unit extracts a visible light intensity per unit time and an infrared light intensity per unit time from the first image signal and the second image signal. A generation unit generates an image signal corresponding to visible light and an image signal corresponding to the infrared light on the basis of the visible light intensity per unit time and the infrared light intensity per unit time.
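The extraction step can be modeled as solving two linear equations: each frame's signal is the visible intensity times that frame's exposure time plus the infrared intensity times the IR irradiation time falling within that frame. This linear model is an assumption for illustration, not the patent's stated math.

```python
# Illustrative model (an assumption, not the patent's exact method):
#   s1 = V*t1 + I*ir1,   s2 = V*t2 + I*ir2
# where V, I are visible/infrared intensities per unit time, t1, t2 the
# exposure times, and ir1, ir2 the IR irradiation time in each frame.

def extract_intensities(s1, s2, t1, t2, ir1, ir2):
    """Solve the 2x2 linear system for (V, I)."""
    det = t1 * ir2 - t2 * ir1
    v = (s1 * ir2 - s2 * ir1) / det   # visible intensity per unit time
    i = (t1 * s2 - t2 * s1) / det     # infrared intensity per unit time
    return v, i
```

For instance, with V = 10, I = 4, exposure times 1 and 3, and IR irradiation only in the first frame (ir1 = 1, ir2 = 0), the signals are 14 and 30, and the solver recovers (10.0, 4.0).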