Patent classifications
G06T7/85
Field calibration of a structured light range-sensor
The technology described herein recalibrates a structured light sensor in the field using time-of-flight sensor data. Structured light sensors are sensitive to mechanical changes that decrease accuracy. A structured light system calculates the range to an object by comparing a reference image to the actual image of the scene, where the reference image is what the projected light pattern would look like on a flat object at a known distance. When the projected pattern changes, the stored reference image no longer matches it. The calibration technology described herein captures a new reference image based on the current sensor characteristics, using a time-of-flight-capable sensor as the structured light imaging sensor.
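The range computation this abstract describes is triangulation from the shift between the reference pattern position and the observed pattern position. A minimal sketch, assuming a rectified projector-camera pair with hypothetical focal length and baseline values:

```python
import numpy as np

def depth_from_pattern_shift(disparity_px, focal_px, baseline_m):
    """Triangulate range from the pixel shift (disparity) between the
    projected pattern's position in the reference image and its observed
    position in the scene image. Parameters are illustrative, not from
    the patent."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        # zero shift means the surface is at infinity in this model
        return np.where(disparity_px > 0,
                        focal_px * baseline_m / disparity_px,
                        np.inf)

# e.g. a 10 px shift with f = 500 px and a 5 cm baseline gives 2.5 m
```

Re-capturing the reference image, as the abstract proposes, amounts to resetting the zero-disparity baseline of this computation to the sensor's current mechanical state.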
MULTI-VIEW IMAGE FUSION BY IMAGE SPACE EQUALIZATION AND STEREO-BASED RECTIFICATION FROM TWO DIFFERENT CAMERAS
Methods for fusing images acquired with two cameras having different sensor types, for example a visible (VIS) digital camera and a short wave infrared (SWIR) camera, include performing image space equalization on the images acquired with the different sensors before performing rectification and registration of those images in a fusion process.
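One common form of image space equalization across dissimilar sensors is histogram matching, which maps one image's intensity distribution onto the other's before geometric alignment. A minimal numpy sketch of that idea (the patent may equalize other image-space properties as well, e.g. resolution or noise statistics):

```python
import numpy as np

def match_histogram(source, reference):
    """Remap source intensities so their cumulative distribution matches
    the reference image's, easing cross-sensor feature matching during
    later rectification/registration. Illustrative sketch only."""
    src_vals, src_idx, src_cnt = np.unique(source.ravel(),
                                           return_inverse=True,
                                           return_counts=True)
    ref_vals, ref_cnt = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_cnt) / source.size
    ref_cdf = np.cumsum(ref_cnt) / reference.size
    # for each source intensity, pick the reference intensity at the same CDF
    matched = np.interp(src_cdf, ref_cdf, ref_vals)
    return matched[src_idx].reshape(source.shape)
```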
METHODS, SYSTEMS, APPARATUSES, AND DEVICES FOR FACILITATING MANAGING CULTIVATION OF CROPS BASED ON MONITORING THE CROPS
Disclosed herein is an apparatus for facilitating managing cultivation of crops based on monitoring the crops. The apparatus comprises an apparatus body, cameras, light sensors, a processing unit, and a communication interface. The cameras generate a measurement of a crop and a field portion, and the light sensors generate an environment measurement of the apparatus's environment. The processing unit analyzes the environment measurement, determines a factor affecting the measurement, and generates a calibrating factor for the cameras; the calibrating factor compensates for the effect of that factor on the measurement. The cameras calibrate a camera parameter based on the calibrating factor to generate the measurement. The processing unit then analyzes the measurement and generates a status of the crop, and the communication interface transmits the status to a device.
INTRINSIC PARAMETERS ESTIMATION IN VISUAL TRACKING SYSTEMS
A method for adjusting camera intrinsic parameters of a multi-camera visual tracking device is described. In one aspect, a method for calibrating the multi-camera visual tracking system includes disabling a first camera of the system while a second camera of the system is enabled, detecting a first set of features in a first image generated by the first camera after detecting that a temperature of the first camera is within a threshold of the first camera's factory calibration temperature, and accessing and correcting intrinsic parameters of the second camera based on the projection of the first set of features into a second image and a second set of features detected in the second image.
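The correction step compares features projected from the thermally trusted camera against features detected by the camera being corrected. As a deliberately reduced sketch of what such a correction could look like, assuming only an isotropic focal-length scale error about a known principal point (the patent's actual parameter model is not specified here):

```python
import numpy as np

def focal_scale_correction(projected_px, detected_px, principal_point):
    """Estimate a focal-length scale factor for the second camera from
    features projected out of the first camera's image (projected_px)
    versus the same features detected in the second image (detected_px).
    Hypothetical reduced model: one scalar, least-squares fit."""
    p = np.asarray(projected_px, float) - principal_point
    d = np.asarray(detected_px, float) - principal_point
    # scale s minimizing sum ||s*p - d||^2 has the closed form below
    return float(np.sum(p * d) / np.sum(p * p))
```

A real system would jointly refine focal lengths, principal point, and distortion, but the thermal gating in the abstract exists precisely so that one camera can serve as a fixed reference for a fit like this.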
Method of calibrating a patient monitoring system for use with a radiotherapy treatment apparatus
Some embodiments are directed to an image detector of a patient monitoring system to obtain calibration images of a calibration sheet or other calibration object at various orientations and locations. The images are then stored and processed to calculate camera parameters defining the location and orientation of the image detector and identifying internal characteristics of the image detector, and this information is stored. The patient monitoring system can be re-calibrated by using the image detector to obtain an additional image of a calibration sheet or calibration object; the additional image and the stored camera parameters are then used to detect any apparent change in the internal characteristics of the image detector.
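Detecting "any apparent change in the internal characteristics" typically reduces to a reprojection check: project the calibration object's known points through the stored camera parameters and measure the pixel error against the new image. A minimal sketch with a hypothetical pinhole model (no distortion terms):

```python
import numpy as np

def reprojection_residual(K, R, t, world_pts, detected_px):
    """Mean pixel error between calibration points projected with the
    stored parameters (intrinsics K, rotation R, translation t) and
    the points detected in the additional image. A large residual
    suggests the detector's characteristics or pose have changed.
    All names and the pinhole-only model are illustrative."""
    cam = R @ world_pts.T + t[:, None]   # world frame -> camera frame
    px = K @ cam                         # pinhole projection
    px = (px[:2] / px[2]).T              # perspective divide
    return float(np.mean(np.linalg.norm(px - detected_px, axis=1)))
```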
Microscope made with CMOS camera(s)
A medical/surgical microscope with two cameras configured to capture two-dimensional images of specimens being observed. The microscope is secured to a control apparatus configured to adjust the toe-in of the two cameras to ensure convergence of the images. The microscope includes a computer system with a non-transitory memory apparatus storing computer program code configured for digitally rendering real-world medical/surgical images, and an illumination system with controls for focusing and regulating the lighting of a specimen. The microscope is configured for real-time video display, with the ability to record and broadcast simultaneously during surgery.
Calibration device for imaging device, monitoring device, work machine and calibration method
A calibration device for an imaging device includes: an imaging data acquisition unit that acquires imaging data of a known external target installed at a known position outside a work range of work equipment, the imaging data being obtained by at least one imaging device provided in a work machine including the work equipment; an external target position acquisition unit that acquires a position of the known external target; and a calibration unit that calibrates the imaging device based on the position of the known external target acquired by the external target position acquisition unit and the imaging data of the known external target acquired by the imaging data acquisition unit.
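Calibrating a camera from a target at a surveyed position and its detections in the image is a classic resection problem. A minimal direct-linear-transform (DLT) sketch, assuming at least six non-degenerate 3D-to-pixel correspondences (the patent does not specify its solver; a deployed system would also refine nonlinearly and model lens distortion):

```python
import numpy as np

def dlt_resection(world_pts, image_pts):
    """Estimate the 3x4 projection matrix of the imaging device from the
    known external target's 3D positions and their pixel detections.
    Each correspondence contributes two linear constraints; the solution
    is the SVD null vector, defined up to scale."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 4)
```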
Smart phones for motion capture
A series of smart phones are mounted on respective tripods to capture the motion of a person wearing markers, such as marker balls or reflectors. The videos from the phones are stripped of objects other than the markers, and the marker videos are combined to render a 3D motion capture structure that may be applied to an image of a VR icon to cause the VR icon to move as the person originally moved.
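Combining per-phone marker videos into a 3D structure implies triangulating each marker from two or more calibrated views per frame. A minimal linear (DLT) two-view triangulation sketch; the projection matrices and coordinates are hypothetical, and a multi-phone rig would extend the same linear system with more rows:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a marker's 3D position from its pixel coordinates x1, x2
    in two phones' views, given their 3x4 projection matrices P1, P2.
    Stacks the standard cross-product constraints and takes the SVD
    null vector (homogeneous point, divided through by its last entry)."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```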
DEPTH-FROM-STEREO BENDING CORRECTION USING VISUAL INERTIAL ODOMETRY FEATURES
A method for correcting a bending of a flexible device is described. In one aspect, the method includes accessing feature data of a first stereo frame that is generated by stereo optical sensors of the flexible device, the feature data generated based on a visual-inertial odometry (VIO) system of the flexible device, accessing depth map data of the first stereo frame, the depth map data generated based on a depth map system of the flexible device, estimating a pitch-roll bias and a yaw bias based on the feature data and the depth map data of the first stereo frame, and generating a second stereo frame after the first stereo frame, the second stereo frame based on the pitch-roll bias and the yaw bias of the first stereo frame.
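The key observation behind approaches like this is that a small yaw bend between rectified stereo cameras appears as a near-uniform disparity offset, which biases every stereo depth; VIO feature depths provide an independent reference to estimate that offset. A hypothetical sketch of the yaw-bias estimate under that simplified model (the patent's actual estimator is not specified here):

```python
import numpy as np

def estimate_yaw_bias_disparity(feature_disparities, vio_depths,
                                focal_px, baseline_m):
    """Estimate a constant disparity offset by comparing observed stereo
    disparities of tracked features against the disparities their
    VIO-derived depths predict. Assumes small-angle yaw bending acting
    as a uniform disparity shift; all parameters are illustrative."""
    expected = focal_px * baseline_m / np.asarray(vio_depths, float)
    # least-squares constant offset = mean residual
    return float(np.mean(np.asarray(feature_disparities, float) - expected))
```

The returned offset would then be subtracted from disparities (or folded into the rectification) when generating the corrected second stereo frame.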
Optical tracking device with built-in structured light module
A system is disclosed that includes an optical tracking device and a surgical computing device. The optical tracking device includes a structured light module and an optical module that includes an image sensor and is spaced from the structured light module at a known distance. The surgical computing device includes a display device, a non-transitory computer readable medium including instructions, and processor(s) configured to execute the instructions to generate a depth map from a first image captured by the image sensor during projection of a pattern into a surgical environment by the structured light module. The pattern is projected in a near-infrared (NIR) spectrum. The processor(s) are further configured to execute the stored instructions to reconstruct a 3D surface of anatomical structure(s) based on the generated depth map. Additionally, the processor(s) are configured to execute the stored instructions to output the reconstructed 3D surface to the display device.
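Reconstructing a 3D surface from the generated depth map starts by back-projecting each depth pixel through the image sensor's intrinsics into a 3D point cloud. A minimal sketch with a hypothetical intrinsic matrix (surface meshing and NIR pattern decoding are outside its scope):

```python
import numpy as np

def depth_to_points(depth, K):
    """Back-project an HxW depth map into an (H*W)x3 point cloud in the
    camera frame: un-project each pixel to a unit-depth ray with K's
    inverse, then scale by its measured depth. K is illustrative."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T
    rays = np.linalg.inv(K) @ pix            # rays at unit depth
    return (rays * depth.reshape(1, -1)).T   # scale rays by depth
```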