Patent classifications
H04N23/10
Auto-Focus Methods and Systems for Multi-Spectral Imaging
Techniques for acquiring focused images of a microscope slide are disclosed. During a calibration phase, a “base” focal plane is determined using non-synthetic and/or synthetic auto-focus techniques. Furthermore, offset planes are determined for color channels (or filter bands) and used to generate an auto-focus model. During subsequent scans, the auto-focus model can be used to quickly estimate the focal plane of interest for each color channel (or filter band) rather than re-employing the non-synthetic and/or synthetic auto-focus techniques.
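The calibration-then-predict idea in this abstract can be sketched as follows. This is an illustrative stand-in, not the patented implementation: it assumes calibration yields one base focal plane plus a fixed offset per color channel, and that later scans simply sum the two. All names and numeric values are made up for the example.

```python
# Hedged sketch: a per-channel auto-focus model that stores a calibrated
# base focal plane and fixed per-channel offset planes, then predicts each
# channel's focal plane on later scans without rerunning a full auto-focus
# search.

class AutoFocusModel:
    def __init__(self):
        self.base_z = None   # base focal plane from calibration (micrometers)
        self.offsets = {}    # per-channel offsets, e.g. {"red": +0.6}

    def calibrate(self, base_z, channel_offsets):
        """Record the base focal plane and channel offsets found during calibration."""
        self.base_z = base_z
        self.offsets = dict(channel_offsets)

    def focal_plane(self, channel):
        """Estimate the focal plane of interest for a color channel or filter band."""
        return self.base_z + self.offsets.get(channel, 0.0)

model = AutoFocusModel()
model.calibrate(base_z=120.5,
                channel_offsets={"red": 0.6, "green": 0.0, "blue": -0.8})
print(model.focal_plane("blue"))  # 119.7
```

The point of the model is the cheap lookup in `focal_plane`: during subsequent scans no focus sweep is needed, only an addition.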
DUAL SYSTEM ON A CHIP EYEWEAR
Eyewear devices are disclosed that include two SoCs sharing the processing workload. Instead of using a single SoC located on either the left or right side of the eyewear device, the two SoCs are assigned different responsibilities, operating different devices and performing different processes to balance the workload. In one example, the eyewear device utilizes a first SoC to operate the OS, a first color camera, a second color camera, a first display, and a second display. A second SoC is configured to run computer vision (CV) algorithms and visual odometry (VIO), track hand gestures of the user, and provide depth from stereo. This configuration organizes the operation of the various features efficiently and balances power consumption.
Recovery of hyperspectral data from image
A method for approximating spectral data, the method comprising using at least one hardware processor for: providing a digital image comprising data in a first set of spectral bands; providing a dictionary comprising (a) signatures in a second set of spectral bands and (b) values in said first set of spectral bands, wherein said values correspond to the signatures, and wherein said first and second sets of spectral bands are different; and approximating, based on the dictionary, data in said second set of spectral bands of said digital image.
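The dictionary structure in this claim can be illustrated with a toy example. The sketch below is a deliberate simplification under stated assumptions: the first band set is 3 RGB values, the second is a 5-band spectrum, and the approximation step is a nearest-neighbor lookup standing in for whatever dictionary-based estimator the patent actually uses. All atoms and band counts are invented for illustration.

```python
# Hedged sketch of the dictionary idea: each atom pairs a value in the first
# band set (here, an RGB triple) with a signature in the second band set
# (here, a 5-band spectrum). A pixel's spectrum is approximated by the
# signature whose RGB entry is closest to the pixel.

def approximate_spectrum(pixel_rgb, dictionary):
    """Return the signature whose first-band-set entry best matches the pixel."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(dictionary, key=lambda atom: sq_dist(pixel_rgb, atom[0]))
    return best[1]

dictionary = [
    ((0.9, 0.1, 0.1), (0.05, 0.10, 0.20, 0.60, 0.80)),  # reddish material
    ((0.1, 0.8, 0.2), (0.10, 0.55, 0.70, 0.30, 0.10)),  # greenish material
    ((0.1, 0.2, 0.9), (0.75, 0.60, 0.25, 0.10, 0.05)),  # bluish material
]
print(approximate_spectrum((0.85, 0.15, 0.12), dictionary))
```

In practice such methods typically combine several atoms (e.g. via sparse coding) rather than picking one, but the key asymmetry is the same: the dictionary bridges two different sets of spectral bands.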
Generate super-resolution images from sparse color information
Techniques for generating a high resolution full color output image from lower resolution sparse color input images are disclosed. A camera generates images. The camera's sensor has a sparse Bayer pattern. While the camera is generating the images, IMU data for each image is acquired. The IMU data indicates a corresponding pose the camera was in while the camera generated each image. The images and IMU data are fed into a motion model, which performs temporal filtering on the images and uses the IMU data to generate a red-only image, a green-only image, a blue-only image, and a monochrome image. The color images are up-sampled to match the resolution of the monochrome image. A high resolution output color image is generated by combining the up-sampled images and the monochrome image.
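The final combining step in this abstract can be sketched with assumed details: nearest-neighbor up-sampling of the low-resolution color planes and a simple pan-sharpening rule that rescales each color by the full-resolution monochrome value. The disclosed pipeline (motion model, temporal filtering, IMU-guided alignment) is not reproduced here; this only illustrates "up-sample, then combine with the monochrome image". All values are invented.

```python
# Hedged sketch: up-sample low-resolution R/G/B planes to the monochrome
# resolution, then rescale each color by the ratio of the monochrome value
# to the up-sampled luminance, so the output inherits the mono detail.

def upsample(plane, factor):
    """Nearest-neighbor up-sampling of a 2-D list by an integer factor."""
    out = []
    for row in plane:
        wide = [v for v in row for _ in range(factor)]
        out.extend([list(wide) for _ in range(factor)])
    return out

def pan_sharpen(r, g, b, mono, factor):
    """Combine up-sampled color planes with a full-resolution monochrome plane."""
    r2, g2, b2 = upsample(r, factor), upsample(g, factor), upsample(b, factor)
    out_r, out_g, out_b = [], [], []
    for y, mono_row in enumerate(mono):
        row_r, row_g, row_b = [], [], []
        for x, m in enumerate(mono_row):
            lum = (r2[y][x] + g2[y][x] + b2[y][x]) / 3.0
            scale = m / lum if lum > 0 else 0.0
            row_r.append(r2[y][x] * scale)
            row_g.append(g2[y][x] * scale)
            row_b.append(b2[y][x] * scale)
        out_r.append(row_r)
        out_g.append(row_g)
        out_b.append(row_b)
    return out_r, out_g, out_b

r, g, b = [[0.5]], [[0.25]], [[0.25]]          # 1x1 sparse color planes
mono = [[0.8, 0.4], [0.4, 0.2]]                # 2x2 monochrome plane
sr, sg, sb = pan_sharpen(r, g, b, mono, factor=2)
print(round(sr[0][0], 3))  # 1.2
```

Rescaling by luminance is one common pan-sharpening choice; the patent's motion-model-based fusion would replace this step.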
APPARATUS HAVING HYBRID MONOCHROME AND COLOR IMAGE SENSOR ARRAY
There is provided in one embodiment an apparatus having an image sensor array. In one embodiment, the image sensor array can include monochrome pixels and color sensitive pixels. The monochrome pixels can be pixels without wavelength selective color filter elements. The color sensitive pixels can include wavelength selective color filter elements.
CAMERA MODULE
According to an embodiment of the present invention, disclosed is a camera module comprising: a light output unit for outputting an optical signal to an object; a sensor for receiving the optical signal reflected from the object; and a control unit for acquiring distance information about the object by using the phase difference of the received optical signal, wherein the sensor includes a non-extraction area from which the phase difference is not extracted and an extraction area from which the phase difference is extracted, and the control unit stops timing control for the received optical signal in the non-extraction area.
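The phase-to-distance principle such an indirect time-of-flight module relies on can be stated compactly: d = c · Δφ / (4π · f_mod), where Δφ is the measured phase difference and f_mod the modulation frequency. The sketch below only illustrates this standard relationship; the example frequency and phase values are not from the patent.

```python
# Hedged illustration of distance from phase difference in an indirect
# time-of-flight sensor: d = c * phase / (4 * pi * f_mod).

import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_rad, f_mod_hz):
    """Distance implied by a phase difference at a given modulation frequency."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

# At 20 MHz modulation, a phase shift of pi radians corresponds to half the
# unambiguous range (c / (2 * f_mod) = ~7.495 m):
d = tof_distance(math.pi, 20e6)
print(round(d, 3))  # 3.747
```

Skipping the timing control in the non-extraction area, as the claim describes, would avoid computing this phase extraction where no distance output is needed.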