H04N13/239

DEVICE AND METHOD FOR DETECTING THE SURROUNDINGS OF A VEHICLE

A device for detecting the surroundings of a vehicle, a method for detecting the surroundings, and a vehicle designed to carry out said method are disclosed. The device comprises a camera module, a camera control apparatus, an analysis unit and an illumination device. The illumination device is formed by a matrix headlight of the vehicle and is designed such that it can project a light pattern into the surroundings. The projected light pattern is imaged in the detection region of the camera module, and the 3D position of the measurement points formed by the light pattern in the surroundings is determined by the analysis unit. However, the illumination device projects the light pattern only into regions of the surroundings for which the analysis unit has ascertained, based on image data, a value that is critical for 3D position determination.
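The 3D position determination described in the abstract amounts to triangulating each projected pattern spot from two known rays: the camera's back-projected ray through the detected spot and the matrix-headlight ray that drew it. The following editorial sketch illustrates that step with the midpoint-of-closest-approach method; the function names, geometry, and tolerances are illustrative assumptions, not taken from the patent:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(cam_origin, cam_dir, proj_origin, proj_dir):
    """Midpoint of the shortest segment between two 3D rays.

    cam_dir:  back-projected camera ray through a detected pattern spot.
    proj_dir: known ray of the matrix-headlight pixel that drew the spot.
    """
    w0 = [a - b for a, b in zip(cam_origin, proj_origin)]
    a, b, c = dot(cam_dir, cam_dir), dot(cam_dir, proj_dir), dot(proj_dir, proj_dir)
    d, e = dot(cam_dir, w0), dot(proj_dir, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel")
    t1 = (b * e - c * d) / denom  # parameter along the camera ray
    t2 = (a * e - b * d) / denom  # parameter along the headlight ray
    p1 = [o + t1 * s for o, s in zip(cam_origin, cam_dir)]
    p2 = [o + t2 * s for o, s in zip(proj_origin, proj_dir)]
    # Where the rays intersect exactly, p1 == p2; otherwise average them.
    return [(u + v) / 2 for u, v in zip(p1, p2)]
```

Because the headlight pattern is only projected where the analysis unit found the image data critical, triangulation runs only on spots inside those regions.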

COLOR STEREO CAMERA SYSTEMS WITH GLOBAL SHUTTER SYNCHRONIZATION
20230239452 · 2023-07-27 ·

Stereo imaging systems and devices are disclosed. A stereo imaging system can include one or more stereo imaging modules and an image processing module connected to the one or more stereo imaging modules by a coaxial cable that carries two-way communication signals and transfers electrical power from the image processing module to the stereo imaging modules. The stereo imaging modules each include a plurality of image sensors positioned to capture images of at least partially overlapping fields of view, and processing circuitry configured to transmit the captured images to the image processing module via the coaxial cable. The image processing module includes processing circuitry configured to receive and process the captured images, and power circuitry configured to provide electrical power to the stereo imaging modules via the coaxial cable. The plurality of image sensors may be color image sensors configured to collect color images for stereo image processing.
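The global-shutter synchronization in the title implies that the processing module must associate left and right frames exposed at (nearly) the same instant. A minimal editorial sketch of such frame pairing by timestamp is shown below; the function name, timestamp units, and skew tolerance are assumptions for illustration, not details from the patent:

```python
def pair_stereo_frames(left_ts, right_ts, max_skew_us=50):
    """Pair left/right frame timestamps (microseconds, sorted ascending)
    whose global-shutter exposures agree within max_skew_us."""
    pairs = []
    j = 0
    for lt in left_ts:
        # Skip right frames that are too old to match this left frame.
        while j < len(right_ts) and right_ts[j] < lt - max_skew_us:
            j += 1
        if j < len(right_ts) and abs(right_ts[j] - lt) <= max_skew_us:
            pairs.append((lt, right_ts[j]))
            j += 1
    return pairs
```

Frames with no partner inside the tolerance are simply dropped rather than matched to a temporally distant frame, which would corrupt the stereo disparity.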

SEPARABLE DISTORTION DISPARITY DETERMINATION
20230007222 · 2023-01-05 ·

Systems and methods for determining disparity between two images are disclosed. Such systems and methods include: obtaining a first pixel image of a scene from a first viewpoint; obtaining a second pixel image of the scene from a second viewpoint (e.g., separated from the first viewpoint in a camera baseline direction such as horizontal or vertical); modifying the first and second pixel images using component-separated correction to create respective first and second corrected pixel images, such that pixel scene correspondence in the camera baseline direction is maintained from the first and second pixel images to the first and second corrected pixel images; determining pixel pairs from corresponding pixels between the first and second corrected pixel images in the camera baseline direction; and determining disparity correspondence for each of the determined pixel pairs from the pixel locations in the first and second pixel images that correspond to the respective pixel locations of the pixel pairs in the first and second corrected pixel images.
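The two core steps the abstract describes are matching pixels in the corrected images along the baseline direction, then mapping the matched pair back to original-image locations. The 1D sketch below is an editorial illustration only: it uses brute-force absolute-difference matching and simple lookup tables in place of the patent's component-separated correction, and all names are hypothetical:

```python
def match_scanline(left, right, max_disp):
    """Brute-force matching along one corrected (rectified) scanline:
    for each left pixel, find the shift d into the right scanline with
    minimum absolute intensity difference."""
    disp = []
    for x, lv in enumerate(left):
        best_d, best_cost = 0, float("inf")
        for d in range(min(max_disp, x) + 1):
            cost = abs(lv - right[x - d])
            if cost < best_cost:
                best_d, best_cost = d, cost
        disp.append(best_d)
    return disp

def to_original_disparity(disp, left_map, right_map):
    """Map each corrected-space pixel pair back through (hypothetical)
    correction maps, corrected x -> original x, so the reported disparity
    is between locations in the original, uncorrected images."""
    return [left_map[x] - right_map[x - d] for x, d in enumerate(disp)]
```

With identity maps the two disparities coincide; with real distortion-correction maps the second step is what recovers disparity in the original pixel grid, as the abstract requires.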

IMAGE CAPTURE DEVICE WITH A SPHERICAL CAPTURE MODE AND A NON-SPHERICAL CAPTURE MODE
20230007173 · 2023-01-05 ·

An image capture device may switch operation between a spherical capture mode and a non-spherical capture mode. Operation of the image capture device in the spherical capture mode includes generation of spherical visual content based on the visual content generated by multiple image sensors. Operation of the image capture device in the non-spherical capture mode includes generation of non-spherical visual content based on visual content generated by a single image sensor.
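The mode switch described above reduces to selecting which image sensors feed content generation: all of them for the spherical stitch, one for flat output. A minimal editorial sketch follows; the class, enum, and sensor identifiers are illustrative assumptions, not from the patent:

```python
from enum import Enum

class CaptureMode(Enum):
    SPHERICAL = "spherical"          # stitch content from all sensors
    NON_SPHERICAL = "non_spherical"  # flat content from one sensor

class ImageCaptureDevice:
    def __init__(self, sensor_ids):
        self.sensor_ids = list(sensor_ids)
        self.mode = CaptureMode.NON_SPHERICAL

    def set_mode(self, mode):
        self.mode = mode

    def active_sensors(self):
        """Sensors whose visual content feeds the current capture mode."""
        if self.mode is CaptureMode.SPHERICAL:
            return list(self.sensor_ids)   # multiple sensors -> spherical stitch
        return self.sensor_ids[:1]         # single sensor -> non-spherical output
```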

USING 6DOF POSE INFORMATION TO ALIGN IMAGES FROM SEPARATED CAMERAS

Techniques for aligning images generated by an integrated camera physically mounted to an HMD with images generated by a detached camera physically unmounted from the HMD are disclosed. A 3D feature map is generated and shared with the detached camera. Both the integrated camera and the detached camera use the 3D feature map to relocalize themselves and to determine their respective 6DOF poses. The HMD receives the detached camera's image of the environment and the 6DOF pose of the detached camera. A depth map of the environment is accessed. An overlaid image is generated by reprojecting a perspective of the detached camera's image to align with a perspective of the integrated camera and by overlaying the reprojected detached camera's image onto the integrated camera's image.
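The reprojection step rests on a standard geometric core: once both cameras are relocalized in the shared feature-map frame, a depth sample from the detached camera can be carried into the integrated camera's frame and projected into its image. The sketch below is an editorial illustration of that core only; the pose convention, intrinsics, and names are assumptions, not details from the patent:

```python
def matvec(R, v):
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(R):
    return [[R[j][i] for j in range(3)] for i in range(3)]

def cam_to_cam(p_src, pose_src, pose_dst):
    """Move a 3D point from the detached camera's frame into the HMD
    camera's frame via the shared world frame. A pose is (R, t) mapping
    camera coordinates -> world coordinates, as recovered by relocalizing
    against the shared 3D feature map."""
    R_s, t_s = pose_src
    R_d, t_d = pose_dst
    p_world = [a + b for a, b in zip(matvec(R_s, p_src), t_s)]
    diff = [a - b for a, b in zip(p_world, t_d)]
    return matvec(transpose(R_d), diff)

def project(p, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Pinhole projection into the integrated camera's image plane
    (illustrative intrinsics)."""
    x, y, z = p
    return (fx * x / z + cx, fy * y / z + cy)
```

Applying this per pixel, with depths from the accessed depth map, reprojects the detached camera's image into the integrated camera's perspective before the overlay.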