Patent classifications
H04N5/341
Capturing and processing of images using camera array incorporating Bayer cameras having different fields of view
Systems and methods are disclosed for implementing array cameras configured to perform super-resolution processing to generate higher resolution super-resolved images using a plurality of captured images, together with lens stack arrays that can be utilized in such array cameras. An imaging device in accordance with one embodiment of the invention includes at least one imager array, in which each imager comprises a plurality of light sensing elements and a lens stack including at least one lens surface configured to form an image on the light sensing elements; control circuitry configured to capture the images formed on the light sensing elements of each imager; and a super-resolution processing module configured to generate at least one higher resolution super-resolved image from a plurality of the captured images.
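The fusion step can be illustrated with a minimal shift-and-add sketch (the helper `shift_and_add_superres` and its parameters are illustrative assumptions, not the patented pipeline): each low-resolution capture is placed onto a finer grid at its known sub-pixel offset, and overlapping contributions are averaged.

```python
import numpy as np

def shift_and_add_superres(low_res_images, shifts, scale=2):
    """Fuse low-resolution captures into one higher-resolution image.

    low_res_images: list of HxW arrays, one per imager in the array.
    shifts: per-image (dy, dx) sub-pixel offsets, in high-res pixel units.
    scale: upsampling factor of the super-resolved grid.
    """
    h, w = low_res_images[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    for img, (dy, dx) in zip(low_res_images, shifts):
        for y in range(h):
            for x in range(w):
                # place each low-res sample at its shifted high-res location
                hy = min(y * scale + dy, h * scale - 1)
                hx = min(x * scale + dx, w * scale - 1)
                acc[hy, hx] += img[y, x]
                weight[hy, hx] += 1.0
    filled = weight > 0
    acc[filled] /= weight[filled]  # average where several captures land
    return acc
```

A real implementation would follow this with interpolation of the unfilled grid positions and a deconvolution step; the sketch only shows the registration-and-accumulation core.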
Systems and methods for array camera focal plane control
Systems and methods for controlling the parameters of groups of focal planes as focal plane groups in an array camera are described. One embodiment includes a plurality of focal planes, and control circuitry configured to control the capture of image data by the pixels within the focal planes. In addition, the control circuitry includes: a plurality of parameter registers, where a given parameter register is associated with one of the focal planes and contains configuration data for the associated focal plane; and a focal plane group register that contains data identifying focal planes that belong to a focal plane group. Furthermore, the control circuitry is configured to control the imaging parameters of the focal planes in the focal plane groups by mapping instructions that address virtual register addresses to the addresses of the parameter registers associated with focal planes within specific focal plane groups.
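The virtual-register mechanism can be sketched as follows; `FocalPlaneController` and its method names are illustrative assumptions, not the actual register interface. A single instruction addressed to a virtual register fans out to the parameter registers of every focal plane in the addressed group:

```python
class FocalPlaneController:
    """Toy model of group-addressed focal plane parameter registers."""

    def __init__(self, num_planes):
        # one parameter register set per focal plane
        self.param_registers = [{} for _ in range(num_planes)]
        # focal plane group register: group id -> member focal planes
        self.groups = {}

    def define_group(self, group_id, plane_indices):
        self.groups[group_id] = set(plane_indices)

    def write_virtual(self, group_id, register_name, value):
        # one write to a virtual address updates the parameter register
        # of every focal plane belonging to the group
        for idx in self.groups[group_id]:
            self.param_registers[idx][register_name] = value
```

This mirrors the abstract's idea that imaging parameters (e.g. exposure) can be set for a whole group with one instruction instead of per-plane writes.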
Multiplane panoramas of long scenes
Methods, systems, and articles of manufacture for generating a panoramic image of a long scene are disclosed. These include fitting a plurality of planes to 3D points associated with input images of portions of the long scene, where one or more respective planes are fitted to each of a ground surface, a dominant surface, and at least one of one or more foreground objects and one or more background objects in the long scene, and where distances from the 3D points to the fitted planes are substantially minimized. They also include selecting, for respective one or more pixels in the panoramic image, one of the input images and one of the fitted planes such that the distance from the selected fitted plane to the surface corresponding to the respective pixels is substantially minimized and occlusion of the respective pixels is reduced in the selected input image; and stitching the panoramic image by projecting, for the respective pixels, the selected input image into a virtual camera using the selected fitted plane.
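The plane-fitting step, minimizing distances from 3D points to a plane, is a standard least-squares problem; a minimal sketch using SVD (function names are illustrative, and the real system fits multiple planes per surface class):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points.

    Returns (centroid, unit_normal): the plane passes through the centroid,
    and the normal is the singular vector of the centered points with the
    smallest singular value (the direction of least variance).
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def point_plane_distance(point, centroid, normal):
    """Perpendicular distance from a 3D point to the fitted plane."""
    return abs(np.dot(np.asarray(point, dtype=float) - centroid, normal))
```

Minimizing the sum of squared `point_plane_distance` values over all assigned 3D points is exactly what this SVD solution achieves.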
Synchronized solid-state imaging element, imaging device, and electronic device
There is provided a solid-state imaging element in which a first substrate in which a pixel circuit including a pixel array unit is formed and a second substrate in which a plurality of signal processing circuits are formed are laminated, and a common reference clock is supplied to the plurality of signal processing circuits that are formed on the second substrate.
Data rate control for event-based vision sensor
In dynamic vision sensors (DVS) or change detection sensors, the chip or sensor is configured to control or modulate the event rate. For example, this control can be used to keep the event rate close to a desired rate or within desired bounds. Adapting the configuration of the sensor to the scene by changing the ON-event and/or OFF-event thresholds yields the necessary amount of data, but not much more than necessary, so that the overall system obtains as much information about its state as possible.
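A minimal sketch of such rate regulation (the function `adjust_thresholds` and its step/limit parameters are hypothetical; the on-chip control loop is not specified in the abstract): raise the contrast thresholds when the measured event rate exceeds the target, so fewer pixels fire, and lower them when the rate falls below the target.

```python
def adjust_thresholds(event_rate, target_rate, on_thr, off_thr,
                      step=0.05, thr_min=0.05, thr_max=1.0):
    """One iteration of a simple DVS event-rate regulator.

    Higher thresholds -> larger log-intensity change needed per event
    -> fewer events. Thresholds are clamped to [thr_min, thr_max].
    """
    if event_rate > target_rate:
        on_thr = min(on_thr + step, thr_max)
        off_thr = min(off_thr + step, thr_max)
    elif event_rate < target_rate:
        on_thr = max(on_thr - step, thr_min)
        off_thr = max(off_thr - step, thr_min)
    return on_thr, off_thr
```

Called once per measurement window, this keeps the event rate hovering near the target; a real controller might adjust ON and OFF thresholds independently or use proportional steps.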
Imaging device and signal processing device
The present disclosure relates to an imaging device and a signal processing device capable of expanding the application range of the imaging device. An imaging device includes an imaging element that includes one or more pixel output units that receive incident light from a subject without an intervening imaging lens or pinhole and output one detection signal indicating an output pixel value modulated by the incident angle of the incident light, and that outputs a detection signal set including one or more detection signals; and a communication unit that transmits, by wireless communication, imaging data including the detection signal set and position-attitude data indicating at least one of a position or an attitude to a communication device. The present disclosure is applicable to, for example, a monitoring system and the like.
IMAGING DEVICE AND METHOD OF DRIVING IMAGING DEVICE
An imaging device includes a plurality of pixels, each of which includes a first pixel element including a first photoelectric conversion unit and a second pixel element including a second photoelectric conversion unit lower in sensitivity than the first photoelectric conversion unit; a readout circuit configured to read out a first image signal based on signal charge generated in the first pixel element during a first accumulation period, and a second image signal, synchronized with the first image signal, based on signal charge generated in the second pixel element during a second accumulation period longer than the first accumulation period; and a control unit configured to control the readout circuit so that a first readout period, during which the first image signal is read out, falls within the second accumulation period.
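The timing constraint described, the first (short-exposure) readout lying entirely inside the second (long) accumulation period, can be expressed as a small check; all names and the assumption that readout starts when the first accumulation ends are illustrative:

```python
def valid_schedule(first_accum, second_accum,
                   first_readout_start, first_readout_len, t0=0.0):
    """Check the control constraint from the abstract.

    Both accumulation periods start at t0. The readout of the
    high-sensitivity (short-exposure) signal must begin no earlier than
    the end of the first accumulation period and finish before the
    low-sensitivity element stops accumulating charge.
    """
    first_end = t0 + first_accum
    second_end = t0 + second_accum
    readout_end = first_readout_start + first_readout_len
    return (second_accum > first_accum
            and first_readout_start >= first_end
            and readout_end <= second_end)
```

Scheduling the short-exposure readout inside the long accumulation is what lets the two signals be output synchronized, without extending the frame time.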
Method of imaging by multiple cameras, storage medium, and electronic device
An imaging method, an apparatus for imaging, a non-transitory storage medium, and an electronic device are provided. The electronic device includes a first camera and a plurality of second cameras. Regions captured by the plurality of second cameras overlap with the edge of the region captured by the first camera. When an image capturing request for an object to be captured is received, the first camera is driven to capture the object based on the request to obtain a base image, the plurality of second cameras are driven to obtain a plurality of second images, and image synthesis is performed on the plurality of second images and the base image to obtain a final image corresponding to the request.
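A toy sketch of the synthesis step, assuming the simplest possible blend, averaging a second-camera patch into the base image where the two overlap (the abstract does not specify the blending method, and `blend_overlap` is a hypothetical helper):

```python
import numpy as np

def blend_overlap(base, patch, top, left):
    """Average a second-camera patch into the base image.

    base: HxW grayscale base image from the first camera.
    patch: hxw patch from a second camera, registered so that its
           top-left corner lands at (top, left) in base coordinates.
    """
    out = base.astype(float).copy()
    h, w = patch.shape
    region = out[top:top + h, left:left + w]
    # naive 50/50 blend in the overlap; a real pipeline would use
    # registration, seam finding, and multi-band blending
    out[top:top + h, left:left + w] = (region + patch) / 2.0
    return out
```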
Photoelectric conversion apparatus, imaging system, and moving object, with high sensitivity and saturation charge
A solid-state imaging apparatus includes first, second, and third semiconductor regions. The third semiconductor region has a second conductivity type. The third semiconductor region extends from a region below the second semiconductor region of a first pixel to a region below the second semiconductor region of a second pixel in the first and second pixels adjacent to each other among a plurality of pixels.
SYNCHRONIZED CAMERA SYSTEM HAVING TWO DIFFERENT CAMERAS
The invention relates to a camera system, and to a method for controlling the camera system, for capturing images of the surroundings of a vehicle for a driver assistance system of the vehicle. The camera system comprises a first rolling shutter camera (1) having a first aperture angle α, a second rolling shutter camera (2) having a second aperture angle β, and control electronics. The first camera (1) is suitable for generating a wide-angle camera image, that is, the first aperture angle α is greater than the second aperture angle β of the second camera (2) which is suitable for generating a tele camera image. The two cameras (1, 2) are designed in such a way that both camera images have an overlap region.
The control electronics is configured to synchronize the two cameras (1, 2).
The geometric arrangement of the two cameras (1, 2) with respect to one another, and the position of the overlap region (10) in the wide-angle image and in the tele camera image, are determined by means of continuous estimation.
The stored geometric arrangement and position are taken into consideration during synchronization of the first camera (1) and the second camera (2) of the camera system.
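One way to realize such synchronization of two rolling shutter cameras is to delay the tele camera's trigger so that the center row of the overlap region is exposed at the same instant in both images; a sketch with hypothetical parameter names (the patent's control electronics are not described at this level of detail):

```python
def tele_trigger_delay(wide_line_time, tele_line_time,
                       overlap_row_start, overlap_row_end, tele_rows):
    """Trigger delay (seconds) for the tele camera relative to the wide one.

    wide_line_time / tele_line_time: rolling-shutter row period per camera.
    overlap_row_start/end: rows of the wide image covered by the tele FOV.
    tele_rows: total row count of the tele image (it spans the overlap).
    """
    # time after the wide trigger at which the overlap's center row is exposed
    wide_center = 0.5 * (overlap_row_start + overlap_row_end) * wide_line_time
    # time after the tele trigger at which its own center row is exposed
    tele_center = 0.5 * tele_rows * tele_line_time
    return wide_center - tele_center
```

With the continuously estimated overlap position feeding `overlap_row_start`/`overlap_row_end`, the delay can be updated as the geometry estimate changes.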