Patent classifications
H04N17/002
SIGNAL DELAY MEASUREMENT
A method for rolling shutter compensation during signal delay measurement, comprising displaying a video test pattern on a display, said video test pattern having a temporal event; capturing a video of the display, by a camera; monitoring a plurality of regions of the display in the video; detecting times (1230, 1240) at which the temporal event appears in each monitored region of the display in the video; and extrapolating the detected times (1230, 1240) to calculate the time (1250) at which said temporal event would appear at a selected region of the video.
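The extrapolation step above amounts to fitting the per-region detection times against sensor row and evaluating the fit at the selected row. A minimal sketch of that idea, assuming a linear rolling-shutter readout (the function name and linear model are illustrative assumptions, not the patent's claimed implementation; the numerals 1230/1240/1250 refer to the patent's detected and extrapolated times):

```python
def extrapolate_event_time(rows, times, target_row):
    """Least-squares line fit t = a*row + b over the monitored regions,
    then evaluate at the target row to estimate when the temporal event
    would appear there (assumes a linear rolling-shutter readout)."""
    n = len(rows)
    mean_r = sum(rows) / n
    mean_t = sum(times) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(rows, times))
    var = sum((r - mean_r) ** 2 for r in rows)
    a = cov / var          # readout rate: seconds per row
    b = mean_t - a * mean_r
    return a * target_row + b
```

For example, events detected at rows 100, 300, 500 at times 10.0, 12.0, 14.0 ms extrapolate to 9.0 ms at row 0.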
Luminescent cold shield paneling for infrared camera continuous non-uniformity correction
A luminescent diode surface is provided within the cold shield of an infrared camera to allow for continuous non-uniformity correction with uniform irradiance across an infrared (IR) detector array. The inclusion of a luminescent diode surface within the cold shield paneling further provides the ability to change the diode bias, producing a negative luminescent effect under reverse bias and an electro-luminescent effect under forward bias. This may then further allow for multiple set points to provide continuous offset and gain correction and to correct non-linear response effects.
Methods and Systems for Calibrating a Camera
A computer-implemented method for calibrating a camera comprises the following steps carried out by computer hardware components: activating a subset of a plurality of light sources according to a plurality of activation schemes, wherein each activation scheme indicates which of the plurality of light sources to activate; capturing an image for each activation scheme using the camera; and calibrating the camera based on the captured images.
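One way to realize such activation schemes is binary coding: scheme k activates every light whose index has bit k set, so N lights can be individually identified from only ceil(log2(N)) captured images. This is an illustrative scheme of my own choosing, not the one claimed in the patent:

```python
def binary_activation_schemes(num_lights):
    """Scheme k activates the lights whose index has bit k set, so a
    detected light spot can be identified from its on/off pattern
    across the captured images (illustrative coding, not the patent's)."""
    num_schemes = max(1, (num_lights - 1).bit_length())
    return [[i for i in range(num_lights) if (i >> k) & 1]
            for k in range(num_schemes)]

def identify_light(on_pattern):
    """Recover a light source's index from its observed on/off pattern."""
    return sum(1 << k for k, on in enumerate(on_pattern) if on)
```

With 8 lights, three schemes suffice; a spot that is lit in images 0 and 2 but dark in image 1 must be light source 5 (binary 101).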
CAMERA TRACKING SYSTEM FOR LIVE COMPOSITING
A 3D camera tracking and live compositing system includes software and hardware integration and allows users to create, in conjunction with existing programs, live composite video. A video camera, a tracking sensor, an encoder, a composite monitor, and a software engine and plugin receive video and data and integrate them with existing programs to generate real-time composite video. The composite feed can be viewed and manipulated by users while filming. Features include 3D masking, depth layering, teleporting, axis locking, motion scaling, and freeze tracking. A storyboarding archive can be used to quickly load scenes with the location, lighting setups, lens profiles, and other settings associated with a saved photo. The video camera's movements can be recorded with video to be later applied to other 3D digital assets in post-production. The system also allows users to load scenes based on a 3D data set created with LIDAR.
METHOD AND APPARATUS FOR MODELING DYNAMIC INTRINSIC PARAMETERS OF A CAMERA
Apparatuses, systems, and methods dynamically model intrinsic parameters of a camera. Methods include: collecting, using a camera having a focus motor, calibration data at a series of discrete focus motor positions; generating, from the calibration data, a set of constant point intrinsic parameters; determining, from the set of constant point intrinsic parameters, a subset of intrinsic parameters to model dynamically; performing, for each intrinsic parameter of the subset of intrinsic parameters, a fit of the point intrinsic parameter values against focus motor positions; generating a model of the intrinsic parameters for the camera based, at least in part, on the fit of the point intrinsic parameter values against the focus motor positions; and determining a position of a fiducial marker within a field of view of the camera based, at least in part, on the model of the intrinsic parameters for the camera.
CALIBRATING SYSTEM FOR COLORIZING POINT-CLOUDS
A system includes a three-dimensional (3D) scanner that captures a 3D point cloud corresponding to one or more objects in a surrounding environment. The system further includes a camera that captures a control image by capturing a plurality of images of the surrounding environment, and an auxiliary camera configured to capture an ultrawide-angle image of the surrounding environment. One or more processors of the system colorize the 3D point cloud using the ultrawide-angle image by mapping the ultrawide-angle image to the 3D point cloud. The system performs a limited system calibration before colorizing each 3D point cloud, and a periodic full system calibration before/after a plurality of 3D point clouds are colorized.
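The colorization step — mapping an image onto a 3D point cloud — reduces to projecting each point through the camera model and sampling the pixel it lands on. A simplified pinhole sketch follows (the patent's ultrawide-angle camera would need a fisheye model instead; all names here are illustrative):

```python
def colorize_point(point, fx, fy, cx, cy, image):
    """Project a 3D point (camera frame, z forward) through a pinhole
    model and sample the pixel color; returns None for points behind
    the camera or outside the image. Simplified stand-in for the
    ultrawide-angle mapping described in the abstract."""
    x, y, z = point
    if z <= 0:
        return None  # behind the camera
    u = int(round(fx * x / z + cx))
    v = int(round(fy * y / z + cy))
    h, w = len(image), len(image[0])
    if 0 <= u < w and 0 <= v < h:
        return image[v][u]
    return None  # projects outside the image
```

The limited versus full calibration distinction in the abstract governs how often fx, fy, cx, cy (and the scanner-to-camera pose, omitted here) are re-estimated.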
RESOLUTION TEST CARD FOR CAMERA MODULE
A resolution test card for a camera module includes a central region and a plurality of corner regions surrounding the central region. The central region and each corner region include color blocks arranged in an array without gaps. Each color block includes at least two straight edges. Any two adjacent color blocks sharing one straight edge, in both the central region and each corner region, have different colors. Each of the at least two straight edges is inclined relative to a first direction and to a second direction perpendicular to the first direction; the at least two straight edges include a first straight edge and a second straight edge perpendicular to each other.
IMAGE FUSION METHOD AND BIFOCAL CAMERA
Embodiments of the present application provide an image fusion method and a bifocal camera. The method includes: acquiring a thermal image captured by the thermal imaging lens and a visible light image captured by the visible light lens; determining a first focal length at which the thermal imaging lens captures the thermal image and a second focal length at which the visible light lens captures the visible light image; determining a size calibration parameter and a position calibration parameter of the thermal image according to the first focal length and the second focal length; adjusting a size of the thermal image according to the size calibration parameter, and moving the adjusted thermal image onto the visible light image according to the position calibration parameter for registration with the visible light image, to obtain to-be-fused images; and fusing the to-be-fused images to generate a bifocal fused image.
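The size and position calibration parameters can be sketched from the two focal lengths alone: scaling the thermal image by the ratio of focal lengths makes equal angular extents cover equal pixel extents, and a translation then aligns the optical centers. This is a simplified sketch that ignores differing sensor pixel pitches and lens distortion, which a real bifocal camera would also calibrate:

```python
def registration_params(f_thermal, f_visible, thermal_center, visible_center):
    """Derive a scale (size calibration parameter) and translation
    (position calibration parameter) for registering the thermal image
    onto the visible image. Assumes matching pixel pitch and aligned
    optical axes; both are simplifications of the patent's method."""
    scale = f_visible / f_thermal
    dx = visible_center[0] - thermal_center[0] * scale
    dy = visible_center[1] - thermal_center[1] * scale
    return scale, (dx, dy)
```

With the visible lens at twice the thermal focal length and concentric optical centers, the thermal image is simply upscaled 2x with zero offset before fusion.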
Image Interpolation Method and Device Based on RGB-D Image and Multi-Camera System
The present invention discloses an image interpolation method and device based on RGB-D images and a multi-camera system, wherein the method comprises: performing camera calibration on each camera in the multi-camera system; determining a position of a new camera for interpolation according to position information of each camera in the multi-camera system, and calculating a camera pose of the new camera according to camera calibration data; calculating a plurality of initial interpolated images in one-to-one correspondence with designated images captured by each camera of the multi-camera system, according to the projection relationship of the camera and the pose information of each camera; performing image fusion on each initial interpolated image to obtain a fused interpolated image; and performing pixel completion on the fused interpolated image so as to obtain an interpolated image related to the new camera.
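The pose of the virtual "new" camera is derived from the calibrated poses of its neighbors. A minimal translation-only sketch, with a distance-based fusion weight for blending the initial interpolated images (rotation interpolation, e.g. quaternion slerp, is omitted; the weighting rule is my assumption, not the patent's):

```python
def interpolate_camera_position(pos_a, pos_b, t):
    """Place the new camera on the segment between two calibrated
    cameras, at parameter t in [0, 1] (translation only)."""
    return tuple(a + t * (b - a) for a, b in zip(pos_a, pos_b))

def blend_weights(t):
    """Fusion weights for the two warped initial interpolated images:
    the closer source camera contributes more (assumed weighting)."""
    return 1.0 - t, t
```

Pixel completion then fills holes left where neither warped source image covers the new viewpoint.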
Image processing apparatus and method extracting second RGB and ToF feature points having a correlation between first RGB and ToF feature points
An image processing apparatus and method include: extracting a second RGB feature point and a second ToF feature point such that a correlation between a first RGB feature point and a first ToF feature point is equal to or greater than a predetermined value; calculating an error value between the second RGB feature point and the second ToF feature point; updating pre-stored calibration data when the error value is greater than a threshold value, and calibrating the RGB image and the ToF image by using the updated calibration data; and synthesizing the calibrated RGB and ToF images.
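The error-check-and-update loop above can be sketched with a 2D translation as the calibration data: compute the mean residual between matched RGB and ToF feature points after applying the current offset, and refresh the offset only when the mean error exceeds the threshold. The translation-only model and the update rule are illustrative assumptions; the patent does not specify either:

```python
from math import hypot

def maybe_update_calibration(rgb_pts, tof_pts, calib, threshold):
    """calib is an assumed (dx, dy) offset applied to the ToF points.
    Returns the mean matching error and the (possibly updated) offset."""
    dx, dy = calib
    residuals = [(rx - (tx + dx), ry - (ty + dy))
                 for (rx, ry), (tx, ty) in zip(rgb_pts, tof_pts)]
    error = sum(hypot(ex, ey) for ex, ey in residuals) / len(residuals)
    if error > threshold:
        # Shift the stored offset to absorb the mean residual.
        mean_ex = sum(ex for ex, _ in residuals) / len(residuals)
        mean_ey = sum(ey for _, ey in residuals) / len(residuals)
        calib = (dx + mean_ex, dy + mean_ey)
    return error, calib
```

When the error stays at or below the threshold, the pre-stored calibration is left untouched and the RGB and ToF images are synthesized as-is.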